Sunday, April 26, 2015

Six must-have features when storing data in the cloud




Cloud storage prices are flirting with free, and capacity limits are on the rise. Just this past March, Google announced its Cloud Storage Nearline service, which costs a penny a month for 1GB of storage. Service providers are dropping prices mainly because they can; technology advances – particularly for data that doesn’t need to be accessed often or quickly – have significantly increased storage efficiency, making it possible for providers to offer next-to-nothing prices. But if enterprises are going to take public cloud storage services seriously, they’ll need more than vast bins of low-priced storage to make it work.
“I think the biggest misperception we get when we talk to customers about cloud storage in particular is they view it as complete outsourcing,” said Henry Baltazar, senior analyst serving infrastructure and operations professionals with Forrester, in a webinar. “They think of it as ‘Okay, if I pull out my credit card, my problems are going to go away.’ And there’s really nothing farther from the truth. You still have a lot of things you have to care about.”
As such, many cloud service providers are offering value-add features, while add-on products such as cloud storage gateways, file sync and share, and hybrid backup are also becoming popular. These are the kinds of features and functions that differentiate one cloud from another. And enterprises should carefully evaluate them when making their decisions. Among the most important considerations are:
Security – Managed user access, authentication, and encryption are essential to protecting data regardless of where it’s stored; this is especially true when moving data outside the confines of the corporate data center. Reporting features for compliance are also necessary. (A brief client-side encryption sketch follows this list.)
Redundancy – Having data stored at different geographical locations for redundancy just makes good sense, even if that means a hybrid approach where data is stored in the data center and backed up in the cloud.
Disaster recovery – Essential in case of natural, machine, or human-driven events. Enterprises can adjust these features based on their tolerance for downtime and data loss. If that tolerance is close to zero – for, say, financial services firms that execute trades for clients – those companies must be prepared to pay for advanced capabilities.
Collaboration tools – Employees and partners need fast, simple access to files. Tools that let them drag and drop files in folders, share files with simple links, and manage access to files keep them productive.
Administrative tools – For enterprises to maintain control over their data, they’ll want to be able to do things like manage permissions, set policies, and establish expiration dates for files.
Mobile access – With the explosion of mobility in the enterprise, providing fast, secure access to cloud-based storage from a variety of devices is essential.
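To make the security item concrete, here is a minimal sketch of client-side encryption before upload, in Python with the cryptography package. It is an illustration, not a description of any provider's feature: the file names are placeholders, and a real deployment would keep the key in a key management service rather than in memory.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Generate a key once and store it safely; losing it means losing the data.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt the file locally before it ever leaves the corporate network.
with open("quarterly-report.xlsx", "rb") as f:
    ciphertext = cipher.encrypt(f.read())

with open("quarterly-report.xlsx.enc", "wb") as f:
    f.write(ciphertext)

# Upload only the .enc file; the provider stores ciphertext, so a breach
# on the provider's side does not expose the plaintext.
```

Pairing client-side encryption like this with the provider's own managed access and encryption features gives two independent layers of protection.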

Friday, April 24, 2015

High-profile data breaches made most CEOs re-examine security programs



Posted on 24 April 2015.
There has been increased board- and C-level interest in information security programs in light of recent high-profile data breaches such as those affecting Sony, Anthem and JP Morgan, the results of a Netskope survey have revealed.

As the severity and consequences of data breaches intensify, Netskope surveyed a hundred information security professionals attending RSA Conference 2015 and found the majority of respondents’ board of directors and CEOs have taken active interest in understanding and improving their company’s security programs.

“As more information is disclosed and media follow every detail of mega breaches, there’s an incredible amount to learn,” said Sanjay Beri, co-founder and CEO, Netskope. “But for all the information available, we were curious to know if the impact of those breaches was enough for board members and CEOs to move the needle in the boardroom. I’m encouraged knowing that recent high-profile data breaches have incited conversations between board-level decision-makers and security teams, and action is being taken to prevent similar breaches.”

The survey – which provides a snapshot of how security decisions are currently being made – offered up these conclusions:

Elevated executive interest
In light of recent data breaches, Netskope found 13.6 percent of cloud app users currently use compromised account credentials at work. The impact of a breach often lingers beyond the immediate attack, and executives are increasingly looking to mitigate that risk. The survey revealed that 69 percent of respondents’ CEOs or boards of directors have asked their security teams about specific security policies in the wake of recent high-profile breaches.

All technology up for discussion
Of those 69 percent of respondents, 28 percent of boards and CEOs have asked specifically about the security of cloud or Software as a Service (SaaS) technologies, and 27 percent have asked about mobile device security and network security, demonstrating that executives are not focused solely on one area of security; rather, they aim to gain a holistic understanding of security programs in general.

Walking the walk
Enterprises use an average of 730 cloud apps; however, 90 percent of those apps are not enterprise-ready, according to the Netskope Cloud Confidence Index. As cloud app adoption continues to skyrocket, enterprises must have visibility into the cloud technologies in use by their organizations. 65 percent of respondents report they have changed, or plan to change, cloud-specific security methods since the Anthem security breach. More than half (52 percent) of respondents confirmed that cloud-specific security methods have changed as a direct result of CEO or board-level conversations.

Wednesday, April 22, 2015

The Key that Unlocks Esperanto


Posted in Smart Encryption
As any student or traveler can attest, learning a new language is hard. How about making one up?
Esperanto will take over Lille, France, this summer, serving as the backbone of discussions, debates and entertainment at an event in the north of France, a region quite proud of its own language. The linguistic oddity that is Esperanto was one person’s vision, in the 1880s, of a unifying, global language. It has survived world wars, mistrust and the era of “OMG”. Unlike the languages bubbling out of “Star Wars” or “Lord of the Rings”, it’s not propelled by an entertainment component. Esperanto is also spoken by a relatively small number of folks.
Can encryption key management learn something from the seldom spoken language of Esperanto? (Esperanto flag image courtesy of Wikipedia.)
I was drawn to reading up on Esperanto through my unabashedly nerdy love of code and programming languages. (The broad appeal of code has teachers pushing for students to “speak” in terms of hardware and software as a language-education imperative.) In an industry plagued by made-up norms and half-baked standardization, the security side of coding and programming shouldn’t lose sight of its intentions in creating a computerized parallel to Esperanto.
In encryption key management, a “language” I feel fairly fluent in, I fear that for many users we’ve replaced flow and communicative elements with a whole bunch of ways to swear and confuse. There is an inherent problem with mass, single instances in the security field when it comes to encryption keys. To get around that, we create multiple, obscured copies of keys to encrypt and decrypt information that needs to be secured. That is starting to add up to a lot of keys for some businesses – and a lot of stress. In looking at how businesses in the U.K. were handling the avalanche of keys, Techworld wrote: “once the bedrock of security, keys and certificates now elicit anxiety. This is perhaps not surprising given the growing number of attacks in which they have been compromised or undermined in a more general way by vulnerabilities such as last year’s Heartbleed. The average U.K. organization in the survey tended 25,500 keys and certificates, with 4 percent of IT staff saying they had no idea where all of this was kept.”
When a key becomes easy to use, it sometimes becomes too easy, opening up security gaps like those exposed recently with one RSA key. When it’s too hard you get things like a product we recently saw advertised that touted so many keys that it comes with a feature for a key manager manager. Or, more likely, end users and knowledge workers avoiding encryption whenever possible, including when it’s necessary.
As much as people fret and flaunt the encryption side of security, it’s the keys which make that function intelligent. Tilt the equation in either direction and you’ll nullify the reason your company is using encryption in the first place. Right now, we’re coming across a lot of businesses, big and medium-sized, going the route of the Key Management Interoperability Protocol, or KMIP. It’s not a perfect communications standard, but it at least allows for secured options within encryption practices. Languages that thrive tend to create new ways to share without removing the backbone of communication. How many slang terms do we use in mainstream discussions?
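To make the proliferation concrete, here is a minimal envelope-encryption sketch in Python using the cryptography package. Each record gets its own data key, and only a wrapped copy of that key is stored alongside the ciphertext; this is a simplified stand-in for what a KMIP-backed key server would track centrally, and all names are illustrative.

```python
from cryptography.fernet import Fernet

# One master key, ideally held in an HSM or a KMIP-managed key server.
master = Fernet(Fernet.generate_key())

def encrypt_record(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt under a fresh data key; return (wrapped_key, ciphertext)."""
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(plaintext)
    wrapped_key = master.encrypt(data_key)  # the stored, obscured key copy
    return wrapped_key, ciphertext

def decrypt_record(wrapped_key: bytes, ciphertext: bytes) -> bytes:
    data_key = master.decrypt(wrapped_key)
    return Fernet(data_key).decrypt(ciphertext)

wrapped, ct = encrypt_record(b"cardholder data")
assert decrypt_record(wrapped, ct) == b"cardholder data"
```

One wrapped key per record is exactly how an organization ends up tending tens of thousands of keys – the bookkeeping problem that KMIP-style central management tries to tame.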
While subtle, our key management dialect could benefit from strengthening what works and easing on the standardization of faulty or abused methods. And you know what they say: you can’t encrypt in Esperanto. At least I think that’s what they say. My Esperanto is rusty.

Compliance officer awarded $1.5 million under SEC whistleblower program


The Securities and Exchange Commission announced an award Wednesday of about $1.5 million to a compliance professional who provided information for an enforcement action against the whistleblower’s company.
The award went to a compliance officer "who had a reasonable basis to believe that disclosure to the SEC was necessary to prevent imminent misconduct from causing substantial financial harm to the company or investors," the SEC said.
The whistleblower will receive between $1.4 million and $1.6 million, the agency said.
Wednesday's award is the second the SEC has made to an employee with internal audit or compliance responsibilities.
In September last year, the SEC awarded about $300,000 to a company audit and compliance employee who complained to the SEC after the company didn't act on the same information.
After that award, Sean McKessy, chief of the SEC’s whistleblower office, said employees who perform internal audit, compliance, and legal functions can be eligible for an SEC whistleblower award "if their companies fail to take appropriate, timely action on information they first reported internally."
Whistleblower awards can range from 10 percent to 30 percent of the money collected in a successful enforcement action in which penalties exceed $1 million.
By law, the SEC has to protect the confidentiality of whistleblowers and can't disclose information that might directly or indirectly reveal their identities.
Since the launch of its whistleblower award program in 2011, the SEC has paid more than $50 million to 16 whistleblowers who provided "unique and useful information that contributed to a successful enforcement action."
Andrew Ceresney, chief of the SEC’s enforcement division, said in a statement Wednesday: “This compliance officer reported misconduct after responsible management at the entity became aware of potentially impending harm to investors and failed to take steps to prevent it.”
*     *     *
The SEC's redacted Order Determining Whistleblower Award Claim (Securities Exchange Act of 1934 Release No. 74781 and Whistleblower Award Proceeding File No. 2015-2) dated April 22, 2015 is here (pdf).
________
Richard L. Cassin is the publisher and editor of the FCPA Blog. He can be contacted here.
http://www.fcpablog.com/blog/2015/4/22/compliance-officer-awarded-15-million-under-sec-whistleblowe.html

Tuesday, April 21, 2015

PCI DSS Version 3.1 - What's New?



Troy Leach of PCI Council Explains New Version of Standard

By Tom Field, April 17, 2015.
Troy Leach, PCI Security Standards Council

The PCI Security Standards Council has just published a new version of the Payment Card Industry Data Security Standard that calls for ending the use of the outdated Secure Sockets Layer encryption protocol that can put payment data at risk. What must security leaders know about PCI 3.1 and its supporting guidance? Troy Leach of the PCI Council offers insights.

The reality about PCI version 3.1? It primarily boils down to removing one cryptography example three times from the published standard. But that small step indeed signals a giant leap forward in payment card security.

The new guidance removes the Secure Sockets Layer encryption protocol, and early versions of Transport Layer Security, as examples of strong cryptography, and calls for use of a current, secure version of TLS.
It's unusual for the PCI Council to issue a mid-year update, but this one is critical, Leach says.
"We recognize that since the last time we published our standard in November of 2013, NIST and other subject matter experts have come out and said that the [SSL] protocol itself has been deprecated," Leach says. "So, we recognized that there was a need to move away from that example."
The PCI Council is giving covered entities until June of 2016 to complete the migration, and Leach encourages these organizations to start their risk assessment process now.
"We're asking the community to have the due diligence to do proper risk management of the situation, make an assessment of whether they are at risk, and then make their strategy progressive, so that they identify the top risks first, eliminate those, and then move forward," Leach says.
In an exclusive interview about PCI DSS 3.1, Leach discusses the changes to the standard, the June 2016 migration deadline, and how organizations should approach risk assessment in the interim.
Leach also will be discussing PCI DSS version 3.1 at RSA Conference 2015 in San Francisco.
In his role as CTO and lead security standards architect for the PCI Council, Leach has developed and implemented a comprehensive quality assurance program. Before joining the council, he led the incident response program at American Express, where he reviewed more than 300 cases of account data compromises. Over the past 18 years, he has held positions in systems administration, network engineering, IT management, security assessment and forensic analytics.
Follow Tom Field on Twitter: @SecurityEditor

Monday, April 20, 2015

Target, MasterCard Settle Over Breach


Retailer Offers Issuers a Total of Up to $19 Million

By Eric Chabrow, April 16, 2015.

Target has agreed to pay a total of up to $19 million to issuers of MasterCard payment cards over losses and expenses they incurred as a result of the retailer's massive 2013 breach.
The settlement announced April 15 is contingent on issuers of at least 90 percent of the eligible MasterCard accounts accepting their offers by May 20. If sufficient issuers accept the offer, Target says they'll be paid by the end of June.
"This settlement provides our issuers a reasonable resolution of the Target data breach event," says Eileen Simon, MasterCard chief franchise integrity officer. "The timely reimbursement of costs and losses under the agreement delivers MasterCard issuers a faster and more certain resolution to the event, while reinforcing our commitment to maintain the integrity of industry security standards."
MasterCard, in a statement, says issuers that choose not to accept this offer will have their claims determined by MasterCard internal processes and may receive more or less than the amounts offered in this settlement, depending on various factors. Those include MasterCard's final determinations of their claims and the outcome of any litigation that Target might file to challenge claim awards to issuers outside of this settlement.
Target also is in negotiations with Visa for a breach-related settlement. "Visa takes very seriously our responsibility to work closely with its acquiring clients and Target to resolve this event," Visa spokesman Jake Standish says. "Visa continues to analyze all relevant information to ensure we reach a resolution that is accurate and fair to all Visa clients and participants in the payments system and we are committed to addressing and resolving this case expeditiously."

Reaction to Settlement

William Murray, an information security and technology consultant, says everyone benefits from the MasterCard settlement, even consumers. "The cost to Target is much less than that of extended litigation," he says, adding that the time it took the two sides to reach a settlement seems reasonable. "I am pleasantly surprised at how quickly it has been done," he says. "Just clarifying the issues is time consuming. Arriving at an agreement in this time demonstrates good will on the part of all parties."
But Jim Nussle, chief executive of the Credit Union National Association, contends the settlement took too long. "It is about time that Target steps up to its responsibilities in this breach," Nussle says. "And it is long overdue for merchants to start living up to their responsibilities in protecting customers' sensitive information by adopting higher security standards."
Dan Berger, chief executive of the National Association of Federal Credit Unions, says the size of the MasterCard settlement was disappointing. "While we appreciate that the settlement attempts to hold Target somewhat accountable, we were hoping it would be more than just pennies on the dollar," Berger says. "We believe that this demonstrates the reason why Congress must act to protect consumers' financial information by enacting stronger standards and holding retailers and merchants directly accountable for their data breaches."
As Target and MasterCard announced their settlement, the House Energy and Commerce Committee passed a data breach notification and security bill that calls on companies to take "reasonable security measures and practices" to secure the personally identifiable information of customers (see National Data Breach Notification Bill Advances).
Target says the 2013 breach compromised at least 40 million payment cards and might have caused the pilfering of personal information from as many as 110 million people. The retailer has reported that its breach costs have totaled at least $252 million so far, with $90 million covered by insurance.
The retailer last month announced a pending $10 million settlement of a consumer lawsuit. Target and MasterCard did not immediately respond to requests for further comment.
Follow Eric Chabrow on Twitter: @GovInfoSecurity

Sunday, April 19, 2015

CMS Administrator: ‘We Need to Adjust and Make Things Easier’



APR 17, 2015
We’ve only scratched the surface on using technology to improve care, access and quality, the acting administrator of the Centers for Medicare and Medicaid Services said Thursday at HIMSS15, and to continue along that path three main actions need to be accomplished.
Speaking in Chicago at the annual conference, Andy Slavitt also called for names of organizations setting up barriers to interoperability. He said care providers and patients should feel all of the investments made in technology, what he called a care dividend. “The taxpayer has invested billions…We have stages and rules and measures,” he said. “What matters is can technology produce more time and capacity for care providers to improve care to patients.”
“We have a great need for modern infrastructure with healthcare,” Slavitt added as the second action point. “For health care to be truly as great as we deserve, it needs far better infrastructure in critical places.”
He said the healthcare system needs to adapt and learn using technology. “We can do with a little less innovation in shareables and wearables,” he said to applause, “and more focus on opportunities to improve healthcare infrastructure.”
The final point, Slavitt told attendees, is one of the greatest concerns—interoperability. He recalled a visit earlier this week to a qualified healthcare center with some of the greatest electronic medical records and quality control in the country. Yet, physicians couldn’t follow patients who left to go to a specialist. It is “not acceptable for taxpayers, it is not acceptable to us,” he said. “I’m asking a great deal more of [the] innovation system. It’s time to get down to business and advance the gains of the last five years.” In return, Slavitt said the government will get better at listening and adapting and being a “more solution oriented partner.”
“As you implement, we need to adjust and make things easier for you,” he added, and pledged for CMS to be clearer in its goals and expectations so “you know where to invest based on how we will reimburse.”
Naming Names
Slavitt said CMS wants to know of every example, small and large, when someone willfully or in some way sets up barriers to interoperability. “We want to know about them,” he said. “I’d like to know about them and I’d like to have conversations with people participating in that and understand what is going on.”
It is not so much a matter of technology as it is a matter of commitment, he added. “I want to make sure we are happy as a team and it is a public service to hear about the issues and confront them.”
Slavitt concluded that the industry is moving to a level where, as you walk the halls of a clinic, people are feeling improvements being made, and he looked to the HIMSS audience to help. “It is going to be this group of folks and companies you represent and how well you work together,” he said. “We have momentum, but I fear if we don’t get urgent about it, it won’t happen quickly enough.”

Local merchants not ready for more secure ‘chip and PIN’ cards    

Massive data breaches at retailers like Home Depot and Target are finally pushing credit card issuers to adopt the technology known as “chip and PIN,” which is much more secure than the magnetic swipe strips used in the United States.
The technology, which uses an embedded computer chip and a personal identification number, has been successful in Canada, Australia, Europe, and elsewhere. In the United States, a new industry standard is pushing widespread use of chip-based cards by October. Banks and merchants that fail to adopt the technology by then will become liable for the costs of fraud that result from data breaches, rather than issuers such as Visa and Mastercard.

Knowing this was on the horizon, I was curious to find out if Greater Boston was ready for this transition. To make this determination, I simply tried to make purchases with the chip portion of my credit card at every store I visited — and kept a chart of my experiences. At more than a dozen brand-name chains and local mom-and-pop merchants, I inserted the chip portion of the card into the terminal to see if it would process my payment.
In the majority of cases, the clerk behind the counter had no clue what I was doing. Once I saw that the chip wouldn’t allow me to buy my items, which happened with every transaction except one in a Lexington sandwich shop, I asked the clerks if I could pay with the chip. They almost always looked annoyed and instructed me to swipe; I think they thought I had gone mad.
At a Watertown grocery store, the clerk told me, firmly, three times to use the swipe. At my local post office, they asked questions and were intrigued by how I was trying to pay.
At a RadioShack in Waltham, the clerk confidently told me “Yeah, I know what NFC is,” even though the chip-and-PIN technology has nothing to do with near field communications, which allows you to wave your card or smartphone to pay. A woman behind the counter at a local shoe store said, “We get lots of European customers who try to use their credit cards that way.”
Then there was the time I thought I could use my new chip card abroad. I was standing in the international terminal at Logan Airport waiting for my flight to London to board; it dawned on me that my new chip-enabled card had arrived without a PIN.
While that wouldn’t pose a problem in the US, in the United Kingdom, cards with chips are always used with a PIN. I called my bank’s toll-free number to get my PIN. I was told that I had already received the number in the mail — nearly 20 years earlier when I first got the card. That slip of paper had obviously long since disappeared.
My findings were that the Boston area is representative of the rest of the country. Hardly anyone is ready for chip-and-PIN. Merchants certainly aren’t, even though roughly 600 million chip-based cards will be sent to American customers by the end of this year, according to the Smart Card Alliance, a group of financial, technology, and other companies promoting the adoption of chip technology.
Store clerks are not trained to accept them, and point-of-sale terminals don’t have the right software. I would guess that most people reading this haven’t received the most basic instruction on how to use them, even if they have already received the new card in the mail.
Smaller countries, with fewer banks and credit card issuers, have taken four years or more to make the transition from magnetic strips, so I believe it will be years before we are all paying with the chip portion of our cards. If you’ve received a chip-and-PIN card, don’t count on using it effortlessly anytime soon. Once we do make the leap, we should see some big changes — hopefully in the form of lower rates of credit card fraud and smaller effects from data breaches.

Joram Borenstein is a vice president at NICE Actimize, a company based in Boston that makes financial crime and fraud prevention software.

Thursday, April 16, 2015

EMEA: KPN pushes Blackphones for security, encryption

KPN’s CIO Jaya Baloo wants everyone to “live long, laugh a lot and encrypt everything.”
Speaking at the International NCSC One Conference in The Hague, Baloo said encryption was essential for protecting the freedom of expression.
Baloo also praised the Blackphone, a secure smartphone designed by Silent Circle that can encrypt voice calls. On April 15, KPN became the first operator in the world to sell the Blackphone in its stores.
In the U.S., both the FBI and the National Security Agency have criticized efforts by companies such as Google and Apple to encrypt information on phones, saying it would hinder efforts to catch criminals and terrorists. At the conference, Baloo railed against such efforts.
“Democracy and civil rights should protect you with real standards,” she said. “Police and prosecutors should be able to substantiate why they want to eavesdrop on someone. If they cannot, they should not have that ability.”
The Dutch telco’s strong focus on privacy comes from hard experience, according to VentureBeat. In 2012, a hack forced KPN to shut down e-mail service to 2 million users.
Baloo, a security expert, was brought on board shortly after that shut-down to help shore up security. The partnership with Silent Circle started with an effort to improve security inside the company, and when the Blackphone was developed it seemed like a natural extension to offer it to KPN customers.
More telecom news from Europe, the Middle East and Africa:
Nokia offers $16.5 billion for Alcatel Lucent in merger deal.  The Finnish ICT company Nokia announced on April 15 that it has entered into a memorandum of understanding with France’s Alcatel-Lucent. Under the terms of the deal, Nokia will make an offer for all the equity securities issued by Alcatel-Lucent for a total value of $16.5 billion. On the same day, Nokia also announced that it would be initiating a “review of strategic options” for Here, its mapping and navigation unit. Options include possible divestment.
Telekom Slovenije stake sale receives only one bid. On April 13, Slovenia’s asset manager reported it had received only one binding offer for the telecom’s assets, without naming the bidder or the amount. However, sources told Bloomberg that the private equity firm Cinven had placed the bid. Telekom Slovenije is estimated to have a market value of $824 million.
MTN Group plans to buy Nigeria’s Visafone – report. Sources told Reuters that MTN Group is working on a deal to buy Visafone Communications in Nigeria, which is home to the largest mobile market in Africa. MTN is already the leading operator in Nigeria with a 44% market share, according to the Nigerian Communications Commission. The country’s rapid mobile growth is outpacing network investment, and recently Nigeria’s consumer protection agency threatened operators delivering poor services with criminal prosecution.
Want to know more? Check out our EMEA coverage, and follow me on Twitter

'Set aside 10 percent of the IT budget for security'

Dutch companies spend more money on the coffee machine than on securing their digital infrastructure. Industry association Nederland ICT believes that has to change. According to board member Rhett Oudkerk Pool, companies and governments should reserve at least 10 percent of their IT budget for measures to improve cybersecurity.
As far as Nederland ICT is concerned, the Netherlands can lead the way in increasing digital security. "The threats are big, but the opportunities are even bigger," says Oudkerk Pool, who in daily life is CEO of security consultancy Kahuna.
Today, politicians, policymakers, businesses and academics from around the world are gathered at the Global Conference on CyberSpace (GCCS) in The Hague to talk about online security and trust. Oudkerk Pool is also attending the conference, mainly to put cybersecurity within companies on the agenda. "It is of course good that policy in this area is being discussed at a global level," he says, "but let's also make sure that within organizations this is translated into practical measures."

At least 10 percent of the IT budget

Just this week, security firm Symantec announced that the Netherlands ranks fourth among countries with the most cyberattacks. According to Oudkerk, that has mainly to do with the Netherlands' position in the digital world: "According to the World Economic Forum, we also rank fourth among countries that make good use of ICT. Digital innovation can make an enormous contribution to our economy, but for that it is crucial that there is, and remains, trust in ICT."
That calls for sustained attention to cybersecurity within organizations, at a strategic level. Organizations should therefore free up at least 10 percent of their IT budget for cybersecurity, Oudkerk believes.

Responsible disclosure

The cybersecurity guide for executives that the Cyber Security Council published this week is, as far as Oudkerk is concerned, a good start. "The guide was compiled by experts on the initiative of Nederland ICT chairman Bart Hogendoorn and contains practical tips that executives can put to work right away."
In addition, Oudkerk sees many opportunities in closer cooperation between organizations. "Take responsible disclosure: reporting security vulnerabilities to companies in a responsible manner. The National Coordinator for Counterterrorism has declared this one of the focal points of the GCCS. In the Netherlands, telecom companies already put their heads together in 2012 to make solid agreements about this, including a reporting point and a code of conduct. That was truly a world first."

Real time auditing

Another area with much room for improvement is the monitoring of digital infrastructure. "We need to do much more real-time auditing," Oudkerk says. "That means constantly monitoring where there may be gaps in the security. Right now companies have that tested once a year at most, but it can easily be done differently. After all, you check every day whether your building is properly locked, don't you?"
The fact that the eyes of the digital world are focused on The Hague this week is a golden opportunity, Oudkerk believes. "Our starting position is excellent; all the ingredients are present. If we push through now, the Netherlands can become the world leader in cybersecurity."


http://www.automatiseringgids.nl/nieuws/2015/16/nederland-ict-over-investeren-in-cyber-security

Monday, April 13, 2015

Most Dutch government websites do not use secure encrypted connections



Only 19% of 2,000 Dutch government websites use the HTTPS protocol for handling requests between a browser and the website it is connected to. HTTPS makes it more difficult for third parties to track users.
Research by Open State Foundation shows that only 1 in 5 government websites can use an encrypted connection, and only 5 percent force encrypted web traffic. Open State Foundation looked at all 3889 Dutch government domains; excluding redirects left 2,093 government websites. Of these, 413 government websites can use an encrypted HTTPS connection and only 120 (5%) force the usage of HTTPS. Just 119 government websites use HSTS, a security feature that lets a website tell browsers that it should only be communicated with using HTTPS instead of HTTP.
The importance of HTTPS
HTTPS (HTTP over SSL, or HTTP Secure) is the use of Secure Sockets Layer (SSL) or Transport Layer Security (TLS) as a sublayer under regular HTTP application layering. HTTPS encrypts and decrypts user page requests as well as the pages that are returned by the server.
Open State Foundation could not find an explanation why one government domain does and the other does not use HTTPS. The websites of the General Intelligence and Security Service (aivd.nl), the National Coordinator for Security and Counterterrorism (nctv.nl) and the National Cyber Security Centre (ncsc.nl) always use the HTTPS protocol, but the websites of the Dutch Tax Service (belastingdienst.nl), the Dutch Review Committee on the Intelligence and Security Services (ctivd.nl) and the Radiocommunications Agency Netherlands (agentschaptelecom.nl) do not encrypt all web traffic.
The low number of government websites that use HTTPS is remarkable precisely because the National Cyber Security Centre advises that all websites handling sensitive data use HTTPS. Citizens expect anything they read on a government site to be official, and they expect any information they submit to that website to be sent safely and only to the government. They also expect government websites to be secure, trustworthy, and reliable.
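A per-domain test along these lines is straightforward to approximate. The sketch below is not Open State Foundation's actual methodology, just a rough Python illustration using the requests library: it checks whether a domain serves HTTPS, whether plain HTTP is redirected (forced) to HTTPS, and whether an HSTS header is sent.

```python
# pip install requests
import requests

def check_domain(domain: str) -> dict:
    """Roughly classify a site: HTTPS available, HTTP forced to HTTPS, HSTS set."""
    result = {"https": False, "forced": False, "hsts": False}
    try:
        r = requests.get(f"https://{domain}", timeout=10)
        result["https"] = r.ok
        result["hsts"] = "Strict-Transport-Security" in r.headers
    except requests.RequestException:
        return result  # no usable HTTPS at all
    try:
        r = requests.get(f"http://{domain}", timeout=10, allow_redirects=True)
        result["forced"] = r.url.startswith("https://")
    except requests.RequestException:
        pass
    return result

print(check_domain("ncsc.nl"))
```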
Unconference on a secure, open and free internet
On Friday, April 17th, 2015, Open State Foundation, together with the Ministry of Foreign Affairs, organizes an unconference, GCCS-Unplugged, where the challenge of an open, free and secure web is addressed in an innovative way.
Data
Download the full list of the 3889 government domains and HTTPS test (zip file / csv 196 KB) here.


http://openstate.eu/2015/04/2176/

Sunday, April 12, 2015

The Two Biggest Cloud Security Concerns for 2015


While many companies would like to adopt cloud services, many still resist over concerns about data security. Here's how managed service providers (MSPs) can overcome the two main objections to cloud computing and cloud-based file sharing in 2015.
As a recent article from CloudWedge says, “The most cited barrier to entry for cloud into the enterprise continues to be the security concerns involved with an infrastructure overhaul.” The problem with that lingering concern is that an enduring lack of education is hindering the market for MSPs. Yet this knowledge gap also presents an opportunity.
What these hesitant or resistant organizations really fear is the unknown. And, what they don’t know is what adopting the cloud will mean for their most valuable, most highly-protected data.
A KPMG cloud security survey of IT executives reports that 45 percent of respondents cite data privacy and data loss as their top hesitations regarding cloud implementation. With that in mind, here are the two biggest objections MSPs must overcome in order to convert more leads in 2015:
How can my organization be sure our data is being kept safe?
It’s no wonder that people are concerned as to whether or not their data will be kept safe in the cloud. There have been high-profile cases of compromised data and information at large enterprises all over the world. And, says CloudWedge, “When data is lost, jobs are typically lost.”
MSPs need to assure organizations that their data, their information, their livelihood will be kept safe up above. It’s important to stress to organizations that the cloud is more secure now than it has ever been before. Plus, why should your customers place the burden of cloud security on their own teams when MSPs are able to offer their experienced, professional services for securing their data and information and jobs?
What if our organization’s data comes under attack?
Maybe the cloud isn’t leaking (raining?) companies’ private information unto the masses, but what if that data comes under attack? What if some cyber criminals discover the organization’s passwords? What if they find a way to delete private data and information, or worse: they “cloudjack” it?
MSPs can alleviate this concern by educating organizations about encryption and data backup. Show these customers and prospects that, above all, managing their cloud-based services means maintaining and even improving their data security.
Data security will always be a major concern of enterprises. But if you are prepared and address these concerns early, misconceptions about cloud security won't be a barrier to your business, they'll be an opportunity. What cloud sharing security concerns have you come across with your clients? Let us know in the comments.

Digital Certificates: Who Can You Trust?


Digital certificates are the backbone of the Public Key Infrastructure (PKI), which is the basis of trust online. Digital certificates are often compared to signatures: we can trust a document because it carries a signature from someone we trust, and online that trusted signer is a certificate authority (CA). Simply put, digital certificates reproduce a simple trust model that occurs in the real world.
Incidents involving digital certificates have been in the news recently. Issues surrounding digital certificates and CAs are not always clear or noticeable to end users. However, IT managers, software developers, and other security professionals need to understand these problems so that the risks can be properly managed.
So who or what can we trust online?
Every computer connected to the Internet contains a list of trusted root CAs. These root CAs issue certificates, which can be used either to sign certificates for other CAs or to identify servers. There needs to be a “chain of trust” from any certificate that the system sees to one of the root certificates that it trusts.
What does “trusted” mean?
If a secure connection or signed file is “trusted”, this generally equates to an absence of warnings. Digital certificates are used to secure websites using SSL/TLS, identify and validate executable files using code signing, and secure email via Secure/Multipurpose Internet Mail Extensions (S/MIME). If a browser accesses an HTTPS server with an untrusted server certificate, it will generate a warning. If an unsigned or untrusted executable file is run, a warning message may be generated. A user may see these messages and avoid potentially risky behavior.
HTTPS is widely used as a way to assure users that connections to sites are authentic. Many users view the “green bars” that browsers use to mark HTTPS addresses as a sign that their connections are safe.
This trust is based on two things:
  1. CAs are not supposed to issue certificates to inappropriate users.
  2. User systems (e.g., PCs, browsers or mobile devices) should not add any inappropriate CA to the list of trusted CAs.
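In practice, a client exercises this chain of trust on every TLS connection. The following sketch uses only Python's standard ssl module: it succeeds only if a chain can be built from the server's certificate to one of the platform's trusted roots, and raises ssl.SSLCertVerificationError otherwise. The hostname is a placeholder.

```python
import socket
import ssl

# create_default_context() loads the platform's trusted root CAs and
# enables both chain verification and hostname checking.
ctx = ssl.create_default_context()

def leaf_certificate(host: str, port: int = 443) -> dict:
    """Return the server's certificate if the chain of trust verifies."""
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

cert = leaf_certificate("www.example.com")
print(cert["issuer"])   # the CA one step up the chain
print(cert["subject"])  # who the certificate identifies
```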
Unfortunately, the basis of this trust is now being challenged. Institutions and organizations that may not necessarily be trustworthy are widely thought of as such.
Here are several cases that highlight the problems in today’s trust-based CA system.
Trust lost: CAs issuing certificates to inappropriate users
CAs need to have a good track record when it comes to securing their own systems to ensure they don’t issue improper certificates. However, there have been incidents where their own security and processes were targeted.
In 2011, an attacker calling himself the ComodoHacker was able to penetrate the systems of Diginotar, a Dutch CA. The attacker issued multiple fraudulent certificates. The loss of confidence in Diginotar’s security led to major operating system vendors removing them from their lists of trusted CAs. This eventually led to Diginotar shutting down as a business.
While Diginotar was a relatively minor player in the CA market, the attacker also claimed to have access to the networks of Comodo, which was a far larger CA.
More recent cases are just as troubling. In March 2015, Comodo issued a certificate for the live.fi domain to an unauthorized party. This domain was the Finnish domain of the live.com online services, which are part of Microsoft. How did this happen?
Comodo issues what are known as Domain Validation certificates. These require the requester to verify that he actually controls the domain he wants a certificate for. The most common method is for the CA to send an email to one of several standard addresses at that domain, namely:
  • admin@
  • administrator@
  • postmaster@
  • hostmaster@
  • webmaster@
The live.fi domain is used by Microsoft to provide free email… and a clever Finnish man found that the address hostmaster@live.fi was available. He was able to acquire this address and use it to obtain a certificate for live.fi that would have been trusted by any browser, even though it was not under the control of Microsoft. In interviews, the Finnish man stated he had already reported the problem to Microsoft and Finnish authorities in January of 2015.
The certificate could have been used to mount man-in-the-middle (MITM) attacks with a certificate that would have been verified by any browser in use. This would have fooled many users into giving up their credentials. Comodo cancelled the certificate, and Microsoft released a separate update for Windows as well.
Fortunately, there was a silver lining. Only one fraudulent certificate was created, and it could not be used for other purposes. This is because the allowed uses are defined in the certificate (Extended Key Usage). Certificates for SSL servers can only be used for server verification; code signing certificates are limited to that specific purpose as well.
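Those allowed uses are readable directly from a certificate. As an illustration (the file name is hypothetical), this Python sketch uses the cryptography package to list the Extended Key Usage entries of a PEM-encoded certificate:

```python
# pip install cryptography
from cryptography import x509
from cryptography.x509.oid import ExtendedKeyUsageOID

with open("server.pem", "rb") as f:  # any PEM-encoded certificate
    cert = x509.load_pem_x509_certificate(f.read())

try:
    eku = cert.extensions.get_extension_for_class(x509.ExtendedKeyUsage)
    uses = list(eku.value)  # a list of ObjectIdentifiers
    print("Allowed uses:", [oid.dotted_string for oid in uses])
    # A typical SSL server certificate carries only serverAuth:
    print("Server auth only?", uses == [ExtendedKeyUsageOID.SERVER_AUTH])
except x509.ExtensionNotFound:
    print("No Extended Key Usage extension present.")
```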
Later in the month, Google had their own certificate snafu. They discovered that an Egyptian ISP (MCS Holdings) held a digital certificate that could be used for MITM attacks via a proxy. Generally, these proxies require that a certificate be installed in devices to be transparent. However, in this case, the MCS Holdings certificate was signed by the China Internet Network Information Center (CNNIC), which was included in root stores. This means that any certificates issued with the MCS Holdings certificate would be seen as valid by systems, even if they had no “right” to issue that certificate (i.e., for domains they did not own).
As in the case of Diginotar, this incident has resulted in serious consequences for the CAs involved. The certificate issued by MCS Holdings has been blacklisted by Google, Microsoft, and Mozilla. In addition, CNNIC itself has been targeted for action as well.
Both Google and Mozilla have indicated that moving forward, certificates issued by CNNIC would no longer be trusted. This means that while organizations that currently rely on CNNIC-derived certificates can still use them, once these certificates expire, new certificates (with a different CA) will need to be acquired.
These cases highlight the inherent risk in a CA-based model: attacks targeting CAs can occur, and if these certificates fall into the hands of attackers with strong motives to intercept communications, user information could be put at risk. For the CAs themselves, any lapse in the process of how certificates are issued can result in a swift blacklisting, instantly ruining what can be a profitable business for the CA.
Mistrust added: inappropriate CAs among trusted CA lists
The most recent case of a CA being added to user systems was Superfish, where adware capable of monitoring HTTPS traffic was preinstalled on Lenovo PCs. Due to the (poor) way it was implemented, in a nutshell, anyone could issue any certificate.
This means that on any PC with Superfish installed, the trust placed in HTTPS might well be misplaced: phishing by making malicious sites appear to be secure, intercepting communications using MITM attacks, signing malware so that users would run it, or making users believe that signed mail was legitimate.
It is difficult, if not impossible, to use the Internet if there is an underlying current of distrust. If we can’t tell whether our visits to Gmail, Facebook, Twitter or online banking sites are safe, we can’t even browse websites. If we can’t tell whether the program we’re trying to run is real or not, even opening what we think is Notepad entails a risk.
What should users and CAs do?
CAs need to ensure that their own systems are secure to minimize the possibility of fraudulent certificate issuance. They should also be careful about issuing certificates for widely recognized and popular domains, especially since the effects, if a wrongly issued certificate becomes public, may be significant. A system based on trusting institutions (such as CAs) only functions well if the said institutions are actually trustworthy.
CAs should also consider verifying certificate requests from domains via other means of communication (such as the phone), as this is a step cybercriminals may have difficulty meeting. Recipients of certificates with more privileges (such as the one issued to MCS) should be subject to tighter controls to prevent potential abuse.
Site owners who want to ensure that fraudulent certificates are not issued in their name should consider pre-registering the email addresses frequently used for website verification, to ensure they stay under the control of the site administrator. Alternately, those addresses should be set aside and not registered at all.
The current CA-based system of trust relies on both certificate authorities and users to exercise good judgment and prudence. It is far from perfect, but it is what we have now. Additional safeguards can improve the security of SSL certificates. As more and more sites become HTTPS-only, this issue will become more prominent in the coming years.


Wednesday, April 1, 2015

Encryption and the Politics of Confusion


Posted in Smart Encryption

A couple of elected officials are finally learning about encryption. In the process, they’ve unleashed resolutions that are sometimes comical and more often chilling.
No, encryption did not start with the iPhone: Rep. Robert Aderholt discusses data security with the FBI during a March subcommittee hearing.
A House of Representatives appropriations subcommittee recently chatted about crypto in a friendly session on a multi-million-dollar budget boost request by FBI Director James Comey. Rep. Robert Aderholt (R-Ala.) flashed his iPhone and talked about its tools as if it were the first device ever to have encryption. Later, he hinted at China’s savvy in asking Apple and other device makers for their own set of encryption keys for everything. Rep. Chaka Fattah (D-Pa.) said the subcommittee could find a balance between privacy and the dire needs of law enforcement by deferring to the “wisdom of Solomon” in insight from a judge. In a wishy-washy contribution, Rep. Mike Honda (D-Calif.) presented little insight from his Silicon Valley constituency. Rep. John Carter (R-Texas), a judge, conceded he knew little about the technological challenges of instituting or accessing all encryption keys, though he concluded that such access creates the “perfect tool for lawlessness.”
Comey, a stated curmudgeon about advances in encryption, likened encryption to the world’s only impenetrable safe (where he’s not completely off) and invoked a hypothetical scenario in which a crying mother castigated him for his inability to access a phone to find her missing daughter. Congressional head-nodding ensued, especially after Comey later indicated this “hard” situation would be best solved by “a legislative fix.”
We don’t expect our elected officials to master every aspect of our lives, and making sure Americans stay safe tends to rest atop their task list. As they discuss and occasionally bumble through the details of encryption keys and device protection, it’s hard for security providers (and anyone who’s used PGP over the last few decades) to keep from cringing at the possible government outcomes. It goes right to the top, too, as President Obama has recently dipped his toe in the cloudy waters of encryption as it relates to tech, privacy, protection and terror.
Absent from the recent subcommittee discussion were notions from business or even data security providers about the benefits to security, privacy and lawful protection already in use. And when the applause died down from Obama’s State of the Union declarations, most of the important business, technology and private security voices did not join him at the microphone at his subsequent cybersecurity shindig. Maybe burned by snooping disclosures or uncertain how to move ahead, these leaders of private industry – who spend a good portion of their time working out ways to prevent breaches and protect information – seem to be left as passive or complicit partners when it comes to encryption. Couldn’t this instead be a moment for government to trust CIOs and the Googles of this country for a creative, secure way forward? (We explored a few of those angles in our latest episode of “Thieves, Snoops and Idiots.”)
Congressional and government leaders have shown there is a significant learning curve over existing technological capabilities and the extent of the law. More directly, as it relates to our personal conversations and billions of daily business transactions, elected officials see an answer in more legislation, compliance and backdoors. Before Obama or Aderholt dives into a copy of “Cryptography for Dummies” to solve this problem, they and their colleagues would be well-served to truly invite technology and business partners into the conversation on paths to potentially expand security and protection.

PCI SSC FAQ on impending revisions to PCI DSS, PA-DSS to address SSL protocol vulnerability






25 March 2015

*Note: This document provides additional information to support the "PCI SSC bulletin on impending revisions to PCI DSS, PA-DSS", 13 February 2015.

Why does SSL need to be removed as an example of "strong cryptography" from the PCI DSS and PA-DSS?

The National Institute of Standards and Technology (NIST) has identified the Secure Sockets Layer (SSL) v3.0 protocol (a cryptographic protocol designed to provide secure communications over a computer network) as not being acceptable for the protection of data due to inherent weaknesses within the protocol. Because of these weaknesses, no version of the SSL protocol meets the PCI Security Standards Council (PCI SSC) definition of "strong cryptography," and revisions to the PCI Data Security Standard (PCI DSS) and the Payment Application Data Security Standard (PA-DSS) are necessary. The successor protocol to SSL is TLS (Transport Layer Security), and its most current version as of this publication is TLS 1.2. TLS 1.2 currently meets the PCI SSC definition of "strong cryptography".

How does it pose a risk to payment card data?

The SSL protocol vulnerability primarily affects web servers and browsers, so if exploited it can jeopardize the security of any payment card data being accepted or processed. Upgrading to a current, secure version of TLS, the successor protocol to SSL, is the only known way to remediate the SSL vulnerabilities which have been most recently exploited by browser attacks including POODLE and BEAST.

How does it impact the security of PIN Transaction Security (PTS) Point-of-Interaction (POI) terminals?

PTS POI terminals (devices such as magnetic card readers or chip card readers that enable a consumer to use a payment card to make a purchase) can be impacted if the software on these terminals is communicating using the SSL protocol. As known vulnerabilities are difficult to exploit in this environment, the Council considers this a lower-priority risk compared to web servers and browsers. Organizations will need to remain up to date with vulnerability trends to determine whether or not they are susceptible to any known exploits. New threats and risks must continue to be managed in accordance with applicable PCI DSS requirements, such as 6.1, 6.2, and 11.2.

Which requirements does it potentially impact?

The changes impact all requirements in the PCI DSS and PA-DSS that reference SSL as an example of "strong cryptography". Specifically:

PCI DSS Requirements 2.2.3, 2.3 and 4.1

PA-DSS Requirements 6.2, 8.2, 11.1 and 12.1-12.2



Which documents will be affected?

As these are revisions to the standards, all PCI DSS and PA-DSS v3.0 documentation will be affected, including: Self-Assessment Questionnaires (SAQ), Attestation of Compliance (AOC), Report on Compliance (ROC), Attestation of Validation (AOV) and Report on Validation (ROV).

When will the PCI DSS and PA-DSS 3.1 revisions be published?

The Council plans to publish the revision for PCI DSS in April, with the PA-DSS revision to follow shortly after.

When published, the revisions will be effective immediately, but impacted requirements will have a sunset date to allow organizations with affected systems to implement the changes. The Council will provide guidance on risk mitigation approaches to be applied during the migration process. For PA-DSS v3.1, the Council is also looking at how to address both future submissions and currently listed applications. The revised standards will be accompanied by a summary of changes document for each standard, as well as supporting guidance to help clarify the impact of these changes, including interim risk mitigation approaches, migration recommendations and alternative options for strong cryptographic protocols.

How is the Council recommending organizations address this vulnerability?

Per the PCI SSC Bulletin on Impending Revisions to PCI DSS, PA-DSS, the Council urges organizations to work with their IT departments and/or partners to understand if and how their systems are using SSL, and to determine available options for upgrading to at least TLS 1.1 or higher as soon as possible.
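On the server side, remediation ultimately means refusing the old protocols during the handshake. As a hedged illustration (not Council guidance; certificate file names are placeholders), this Python sketch pins the floor at TLS 1.2, the version the FAQ cites as currently meeting "strong cryptography":

```python
import ssl

# Build a server-side context that refuses SSL and early TLS outright.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # no SSLv3, TLS 1.0 or 1.1
ctx.load_cert_chain(certfile="server.pem", keyfile="server.key")

# Hand ctx to the web server or wrap a listening socket with it; legacy
# clients will now fail the handshake instead of silently downgrading.
```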



When publishing the revisions, the Council will also provide guidance and educational webinars on the use of interim risk mitigation approaches, migration recommendations and alternative options for strong cryptographic protocols.

***
https://www.pcisecuritystandards.org/pdfs/15_03_25_PCI_SSC_FAQ_SSL_Protocol_Vulnerability_Revisions_to_PCI_DSS_PAD.pdf


Evolving PKI Standards

Part of a six-part series: The Evolution of Public Key Infrastructure
The World of PKI Standards
Public Key Infrastructure (PKI) facilitates security services across a global community of users and across various applications. As such, standards are a key requirement for the success of PKI. Without standards for PKI data structures, trust management and interoperability, the use of PKI would be severely limited, and the technology might never have experienced the success it has over the past 20-25 years.
Public Key Infrastructure (PKI) standards have evolved significantly over the past 25+ years. The initial standard was a relatively simple one, defined as a common base to support multiple applications. As individual applications and user communities adopted PKI technology, they fed additional requirements into the standards bodies, resulting in a number of updates to that base standard. In addition, some applications that have specific unique requirements have developed their own standards or standard profiles for use in their specific application. The ICAO electronic passport standard is one recent example.
What’s in a Standard?
All PKI standards created over these years cover at least the following main subjects:
• Certificate content and format
• Certificate revocation
• Trust models
All public-key certificates contain at least a public key, identification of the subject/holder of that key, identification of the Certification Authority (CA) that issued the certificate, and the time period during which the certificate can be used. All certificates are digitally signed by the issuing CA. Some certificates are issued to end-entities, such as human users or devices. Other certificates are issued to CAs, enabling certification paths to be constructed between a locally trusted CA and the remote CA that issued the certificate of interest.
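
As a concrete illustration (not part of any standard's text), the sketch below uses Python's third-party cryptography library to read exactly those baseline fields out of a certificate; the file path is a placeholder.

    from cryptography import x509
    from cryptography.hazmat.backends import default_backend

    # "server.pem" is a placeholder path to any PEM-encoded certificate.
    with open("server.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read(), default_backend())

    print("Subject:    ", cert.subject.rfc4514_string())  # holder of the key
    print("Issuer:     ", cert.issuer.rfc4514_string())   # the signing CA
    print("Not before: ", cert.not_valid_before)          # validity period...
    print("Not after:  ", cert.not_valid_after)           # ...of the certificate
    print("Public key: ", cert.public_key())
    print("Signed with:", cert.signature_hash_algorithm.name)
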
Certificate revocation is a mechanism to shorten the validity period after a certificate has been issued. If, for example, an employee leaves the company, or the corresponding private key is compromised, the CA will likely want to revoke the corresponding certificate for the remainder of the published validity period. In some environments, rather than perform certificate revocation, CAs issue only certificates with very short initial validity periods, eliminating the need to revoke them.
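
For illustration, a minimal sketch of a CRL lookup using the same cryptography library follows; the CRL path and serial number are hypothetical, and a real relying party would also verify the CRL's own signature and freshness.

    from cryptography import x509
    from cryptography.hazmat.backends import default_backend

    # "ca.crl.pem" is a placeholder path to a CA's published CRL.
    with open("ca.crl.pem", "rb") as f:
        crl = x509.load_pem_x509_crl(f.read(), default_backend())

    serial = 0x1234ABCD  # hypothetical serial of the certificate in question
    entry = crl.get_revoked_certificate_by_serial_number(serial)
    if entry is None:
        print("Not on this CRL (its signature and age still need checking).")
    else:
        print("Revoked on:", entry.revocation_date)
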
PKI standards support a number of trust models, including hierarchical, distributed and bridge models. The appropriate trust model depends largely on the application in which PKI is being deployed and the community of users that will be taking part.
X.509 – The Base Standard for PKI
The initial “base standard” for PKI, developed jointly by the International Telecommunication Union (ITU) and the International Organization for Standardization (ISO), was published in 1988. This standard, commonly referred to by its ITU identifier “X.509” (its ISO identifier is 9594-8), defined a simple format for public-key certificates that enabled them to be distributed widely and enabled users to verify their authenticity and integrity. The certificates, digitally signed by the issuing Certification Authority (CA), contain a public key, the identity of the key holder and the validity period of the certificate. The CA also regularly publishes a digitally signed Certificate Revocation List (CRL) identifying certificates that had to be revoked prior to the end of their validity period. Because certificates can be issued to CAs as well as end-entities, certification paths can be constructed and any of the trust models can be supported. The original driver for this standard was the ITU-T X.400 secure messaging standard. However, the base standard was defined to support any application, not just X.400's requirements.
Establishing Trust Among User Communities
In the 1990s, as more communities of users began to adopt PKI and the requirement to establish trust among user communities emerged, a number of requirements for enhancements to the base X.509 standard were identified. The U.S. Federal Government, for example, established a PKI infrastructure to connect the various government departments together and to bridge that integrated infrastructure with external PKI user communities. Trust needed to be manageable, and constraints to control the boundaries of the interconnected user community needed to be possible. A set of extensions was added in the 3rd edition of X.509 (1997). Certificate extensions included constraints on the use of the public key, constraints on the certification paths that could be constructed, alternative names associated with the certificate issuer and the certificate subject, and a link from the certificate to the appropriate CRL. CRL extensions included the ability to partition the set of revoked certificates across a set of smaller CRLs, identification of the set of certificates covered by a specific CRL, the ability to issue interim delta CRLs, and the ability to identify a CRL issuer other than the CA that issued the certificates.
This milestone was the only time the actual certificate and CRL structures themselves were modified: because of the addition of an optional field for the extensions, the versions were updated to version 3 certificates and version 2 CRLs. The extension mechanism is flexible and enables new features to be added to certificates and CRLs in a backward-compatible way. Although there have been several updates to the X.509 standard, the versions for certificate and CRL formats remain the same today. This consistency in versions is critical to interoperability, and any future enhancements to the standards must also be done within the framework of the existing versions.
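
The extension mechanism can be seen directly when parsing a version 3 certificate. The sketch below (again assuming the cryptography library and a placeholder file path) probes a few of the extensions mentioned above.

    from cryptography import x509
    from cryptography.hazmat.backends import default_backend

    with open("server.pem", "rb") as f:  # placeholder path, as before
        cert = x509.load_pem_x509_certificate(f.read(), default_backend())

    # Extensions are optional, so absent ones raise ExtensionNotFound.
    for ext_cls in (x509.BasicConstraints,        # CA vs end-entity, path length
                    x509.KeyUsage,                # constraints on the public key
                    x509.SubjectAlternativeName,  # alternative subject names
                    x509.CRLDistributionPoints):  # link to the appropriate CRL
        try:
            ext = cert.extensions.get_extension_for_class(ext_cls)
            print(ext_cls.__name__, "->", ext.value)
        except x509.ExtensionNotFound:
            print(ext_cls.__name__, "-> not present")
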
The Internet PKI
Although the X.509 standard can be used in various applications, it makes no statement about which features/extensions should be mandatory or optional. The Internet Engineering Task Force (IETF) saw the need for specific interoperability profiles tailored for Internet applications. The IETF profile for “the Internet PKI” has, in fact, been adopted globally as the de facto base profile for certificates and CRLs, far beyond its original intent. One likely reason is that it is a general-purpose profile intended to support generic Internet applications: it provides at least some guidance on mandatory requirements while not being over-restrictive or application-specific. Another contributing reason may be that IETF RFCs are available free over the Internet, while the current version of the X.509 standard must be purchased. The IETF specification has itself undergone two revisions: the current version is RFC 5280, published in 2008; previous versions were RFC 2459, published in 1999, and RFC 3280, published in 2002. The IETF also defined application-specific profiles for Internet applications such as secure messaging (S/MIME) and Transport Layer Security (TLS), which supports the authentication of websites. RFCs were also defined for PKI operational protocols, including certificate management protocols, as well as the Online Certificate Status Protocol (OCSP), an alternative to CRLs for revocation status checking.
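
As a sketch of how OCSP is used in practice (assuming the cryptography library; file paths are placeholders), a client builds a request that identifies the certificate by hashes of its issuer plus its serial number, then sends the DER encoding to the responder URL advertised in the certificate's Authority Information Access extension.

    import base64
    from cryptography import x509
    from cryptography.hazmat.backends import default_backend
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.x509 import ocsp

    # Placeholder paths: the certificate being checked and its issuing CA's cert.
    with open("server.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read(), default_backend())
    with open("issuer.pem", "rb") as f:
        issuer = x509.load_pem_x509_certificate(f.read(), default_backend())

    # The request identifies the certificate by issuer-name/key hashes and serial.
    request = ocsp.OCSPRequestBuilder().add_certificate(
        cert, issuer, hashes.SHA1()).build()
    der = request.public_bytes(serialization.Encoding.DER)

    # POST the DER body (or GET its base64 form) to the OCSP responder URL.
    print(base64.b64encode(der).decode())
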
Passports & Smartcards
Other standards groups and industry forums have developed specifications for their own environments. Generally these specifications tend to be based on the X.509 base standard as well as the IETF profiles and additional protocols. One example is the International Civil Aviation Organization (ICAO) and its specifications for electronic passports. In that application, a digitally signed copy of data about the document, the document issuer and the document holder is stored on a contactless IC within the passport. The associated certificates and CRLs are made available to border control so the digital signature can be verified and the data validated. For border control, this makes uncovering impersonation more reliable and identification of fraudulent documents easier. The PKI supporting this ICAO application is quite simple: each country operates a single CA, and that CA signs certificates for the Document Signers that sign the passport data. There are no intermediate CAs, eliminating the need for most certificate extensions and simplifying the validation process significantly.
In addition to the base X.509 certificate format, some different formats have been defined to satisfy specific requirements where X.509 isn't the best fit. One example is the ISO 7816 card-verifiable certificate format, which better suits environments where certificates will be processed on chips in smart cards, passports, etc. These certificates tend to be smaller than X.509 certificates, have shorter validity periods and have no revocation scheme. One application where this certificate format is being used is the authorization of access rights to sensitive fingerprint biometrics on travel documents within the European Union. Because certificate validation is done on the passport chip, the chip's existing ISO 7816 file system and library can be re-used and an additional X.509 validation library is not required. Given that these chips have much more restrictive processing capabilities than the computers typically used for validation in an X.509 PKI environment, the use of the ISO 7816 PKI rather than X.509 is critical.
With terrorism a high priority for governments globally, secure passports continue to be a focus area and enhancements to the standards are currently being developed that will provide additional support to border control. These enhancements will provide electronic access to additional data, including travel entry and exit stamps, electronic visas and additional biometrics on passport chips. Data integrity and authenticity will be supported using X.509 PKI. Read and write permissions will be managed through an ISO 7816-based PKI authorization infrastructure.
We’ll be covering PKI in government in more detail in a future blog in our Evolution of PKI series. Meanwhile, our next blog will take a closer look at the role of PKI in Authentication.
Sharon Boeyen