The European Commission is due to present a proposal on cybersecurity in February once it has received feedback from the European Parliament and EU countries.
The proposal was initially announced in May for the third quarter this year but has been delayed.
EU moves to protect critical infrastructure echo similar concerns worldwide amid an increasing number of cyber attacks globally that can disrupt important areas of the economy, from online banking to stock exchanges.
"Minimum security requirements should also apply to public administrations and operators of critical information infrastructure to promote a culture of risk management and ensure that the most serious incidents are reported," the report said.
Unlike the United States, where companies are required to report online attacks (a rule supporters say forces them to keep their cyber defences tight), the EU takes a piecemeal approach.
Some countries, such as Britain, oppose mandatory reporting, believing it would encourage companies to cover up online breaches so as not to alarm their customers.
An EU official said the aim of the report was to get companies to be more open about cyber attacks and help them fend off such disruption.
"We want to change the culture around cyber security from one where people are sometimes afraid or ashamed to admit a problem, to one where authorities and network owners are better able to work together to maximise security," the official said.
European companies in critical areas of the economy "lack effective incentives to provide reliable data on the existence or impact" of network security incidents, the report said.
Companies fear that revealing their vulnerability could cost them customers, but authorities are eager for increased transparency to try to shut down the methods hackers use to exploit networks before they can do widespread damage.
"Cyber security incidents are increasing at an alarming pace and could disrupt the supply of essential services we take for granted such as water, sanitation, electricity, or mobile networks," the report said.
The EU proposal would require companies in critical infrastructure areas to conduct risk assessments and work with national authorities to ensure a minimum standard across the 27-country bloc.
Inconsistent measures on cyber security also carry an economic cost. In 2012, 38% of the EU's internet users said they were concerned about making payments online, an EU poll showed.
Monday, December 31, 2012
To Encrypt Email or Not to Encrypt Email? Practical Answers to a Question That Is Surprisingly Complex
by Elizabeth H. Johnson
Health care providers frequently ask us whether they have to encrypt emails, particularly those sent to patients who have asked for an emailed copy of their health records. Since patients have a right to receive electronic copies of their health records, emailing them a copy when they ask for it seems like the right thing to do.
Unfortunately, the decision actually is more complicated. HIPAA requires that all electronic transmissions of protected health information (PHI) be encrypted. That means ALL of them … fax, email, web-based and otherwise. The requirement applies regardless of the identity of the recipient or patient, and the recipient cannot “undo” or waive the requirement by consenting to the receipt of unencrypted emails.
Because encryption can be expensive and technically difficult to implement, HIPAA does include an exception of sorts. You may hear the encryption requirements and certain other HIPAA security requirements referred to as “addressable” — that designation means the requirements can be treated as optional if the following “exception” applies.
Before it can decide not to encrypt electronic transmissions of PHI, a HIPAA-covered entity must engage in a feasibility analysis. The analysis must consider each of the following factors:
- Size, complexity and capabilities of the covered entity
- Covered entity’s technical infrastructure, hardware and software security capabilities
- Costs of security measures
- Probability and criticality of potential risks to electronic PHI
If encryption proves to be too expensive and difficult in comparison to the covered entity’s size and capabilities and seems to add little value to the overall security of PHI, then HIPAA allows the covered entity to forego encryption. HIPAA requires, however, that the covered entity implement an equivalent, alternative security measure if it is “reasonable and appropriate” to do so (as determined using the same analysis described above), which invites covered entities to evaluate several options to determine which is the most appropriate. In all cases, the covered entity must document its analysis and decisions.
One more time in English? Health care providers are allowed to send PHI in unencrypted emails but only after they engage in the analysis described above and document their determination. It is a violation of the HIPAA Security Rule to send unencrypted emails containing PHI without first having performed and documented that analysis. A single violation can carry a penalty as high as $50,000, a useful figure to contemplate if you think encryption is too expensive to implement. Encryption also carries the benefit of qualifying for a “safe harbor” under HIPAA’s breach notification requirements. A security incident that would otherwise require notification is not considered a breach if the PHI affected were encrypted and the encryption key has not been compromised.
Other solutions besides encrypted email may be more prudent when patients request copies of their records. For example, physician practices that maintain patient portals are able to provide patients with access to records and (often) communications through a secure website. Another alternative is to ask patients to come by and pick up a copy of their electronic health record on a password-protected CD. It’s also appropriate to check some form of ID when patients arrive to pick up the records. Share the password with them only after verifying their identity.
For patients who simply insist on receiving email, if that email cannot be encrypted then a health care provider may be left with two unappealing choices. Choice one is to refuse, in which case patients may rightly insist that the provider has not respected their right (guaranteed by the HITECH Act) to receive a copy of their electronic health records. Choice two is to fulfill the request, send the unencrypted email, and risk violating the HIPAA Security Rule. We think the better choice is to send the email, but only after the health care provider engages in the required feasibility analysis and documents the outcome as described above to help ensure Security Rule compliance. It’s also a good idea to advise patients of the potential risks and insecure nature of email, and then ask again if they really want the record sent in that manner. To be clear, however, the patients’ informed consent to the insecure nature of email does not alleviate the provider’s requirement to engage in the feasibility analysis.
If unencrypted email is still the choice of both the provider and the patient, then as a precautionary move health care providers should consider attaching the records as a password-protected file. Once the email is sent, the provider can call the patient to share the password by phone. That follow-up phone call accomplishes two things: first, the provider can confirm that the patient actually received the email; and second, the password is not disclosed until the patient has received the records. If the email is accidentally sent to the wrong place, the recipient is unlikely to be able to open the file and view health information without the password. Although taking these measures will not meet HIPAA’s encryption requirement (providers seeking an exception still have to do the feasibility analysis), they do minimize the likelihood of an inadvertent disclosure that could anger patients and lead to a HIPAA complaint.
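As a minimal sketch of that precaution, the snippet below builds an AES-encrypted, password-protected ZIP attachment. It assumes the third-party pyzipper library and invented file names; it is an illustration of the workflow described above, not a compliance recommendation.

```python
import secrets
import pyzipper  # third-party library that supports AES-encrypted ZIP archives


def package_record(record_path: str, archive_path: str) -> str:
    """Wrap an exported health record in a password-protected, AES-encrypted ZIP.

    Returns the generated password so it can be shared with the patient by phone,
    never in the email that carries the attachment.
    """
    password = secrets.token_urlsafe(12)  # random password
    with pyzipper.AESZipFile(archive_path, "w",
                             compression=pyzipper.ZIP_DEFLATED,
                             encryption=pyzipper.WZ_AES) as zf:
        zf.setpassword(password.encode("utf-8"))
        zf.write(record_path, arcname="health_record.pdf")
    return password


# Hypothetical usage: attach 'record.zip' to the email, then phone the patient
# with the password returned here.
# password = package_record("record.pdf", "record.zip")
```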
Thursday, December 27, 2012
Top 10 risks found by your auditor
KirkpatrickPrice offers a list of the most common risks they find.
1. No formal policies and procedures
Formal policies and procedures give your employees clarity about what is expected of them. They define accountability for each employee and establish the necessary training. Information security policies are mandated by the FTC Safeguards Rule, the PCI Data Security Standards, and the HIPAA Security Rule, so they are not optional.
2. Misconfigurations
Standards need to be applied consistently. Organizations should use benchmark configuration standards from a recognized entity such as the Center for Internet Security (CIS), the International Organization for Standardization (ISO), the SANS (SysAdmin, Audit, Network, Security) Institute, or the National Institute of Standards and Technology (NIST).
3. No formal risk assessment
The assessment should cover the assets your enterprise depends on to continue business operations: hardware, software, human resources, and processes (automated or manual). Important considerations are the threats to those assets and the likelihood of a vulnerability being exploited. Threats can be internal (employees, third-party contractors or partners) as well as external (natural events or social engineering). A proper risk assessment helps you mitigate the risks you actually face; a rough scoring sketch appears after this list.
4. Undefined incident response
An incident response plan should include clear reporting procedures. Build a workplace culture that encourages employees to report every incident the moment it presents itself.
5. Lack of disaster planning
Disaster planning ensures that written plans are available for others to follow when key personnel are not. A business impact analysis can help quantify the level of redundancy required. Make proactive arrangements to care for staff and to communicate with third parties. Walkthroughs and training scenarios help ensure employees are properly prepared in the event of a disaster.
6. Lack of testing
The concept of testing applies to all areas of your security. If your security is not tested, there is no way to determine whether or not vulnerabilities are present.
7. Insecure code
Secure coding is something many companies struggle with. It requires developer training as well as specific development standards and quality assurance.
8. Lack of monitoring/audit trails
Log harvesting, parsing, and alerting methods must be defined to deal efficiently with large volumes of event logs; a minimal parsing sketch appears after this list. Responsibility for log review must be formally assigned as part of daily operations. Audit trails should be stored so that system administrators cannot modify them without alerting someone in an oversight role.
9. Data leakage
Questions that are often forgotten: Where is the data located, and how long should it be retained? How is encryption implemented and verified? How is access to data granted and audited? Leaving these unanswered can keep you from complying with federal and industry standards and regulations.
10. Lack of training
A lack of training can prove to be a striking blow to the security of your organization. Employers should recognize the importance of properly training all employees on safety and security best practices. Standards and guidelines should be clearly set and determined in each organization. Several training opportunities are offered through KirkpatrickPrice to properly train you and your company on the basics of security awareness, awareness for managers, awareness for IT professionals, and awareness for credit card handling.
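As a rough illustration of the likelihood-and-impact weighing described under item 3, here is a minimal risk-scoring sketch. The assets, threats, numeric scales, and priority thresholds are invented for the example and are not part of KirkpatrickPrice's methodology.

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Classic qualitative scoring: likelihood (1-5) times impact (1-5)."""
    return likelihood * impact


# Hypothetical register entries: (asset, threat, likelihood, impact)
register = [
    ("payroll server", "malware via phishing", 4, 5),
    ("office printers", "hardware failure", 3, 1),
    ("customer database", "SQL injection", 3, 5),
]

for asset, threat, likelihood, impact in register:
    score = risk_score(likelihood, impact)
    priority = "HIGH" if score >= 15 else "MEDIUM" if score >= 8 else "LOW"
    print(f"{asset:20s} {threat:25s} score={score:2d} priority={priority}")
```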
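And for the log monitoring discussed under item 8, a minimal parsing-and-alerting sketch. The OpenSSH-style log format and the failed-login threshold are assumptions chosen for illustration, not auditor guidance.

```python
import re
from collections import Counter

# Matches OpenSSH "Failed password" lines, capturing the user and the source IP.
FAILED_LOGIN = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")
THRESHOLD = 10  # alert after this many failures from a single source IP


def scan_auth_log(path: str) -> dict:
    """Count failed SSH logins per source IP and return those worth alerting on."""
    failures = Counter()
    with open(path) as log:
        for line in log:
            match = FAILED_LOGIN.search(line)
            if match:
                failures[match.group(2)] += 1
    return {ip: count for ip, count in failures.items() if count >= THRESHOLD}


# Hypothetical usage against a standard auth log:
# for ip, count in scan_auth_log("/var/log/auth.log").items():
#     print(f"ALERT: {count} failed logins from {ip}")
```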
Saturday, December 22, 2012
Hackers Steal Data from Pentagon, NASA, Federal Reserve
Ben Weitzenkorn, TechNewsDaily Staff Writer
Date: 12 December 2012 Time: 01:53 PM ET
Members of the Anonymous-affiliated Team GhostShell hacking collective have published what they claim is stolen information for 1.6 million accounts linked to government agencies, including the Pentagon, NASA and the Federal Reserve. The hackers appear to have breached the database with a malicious SQL injection, ZDNet reported, stealing passwords and corresponding email addresses, phone numbers, home addresses and notes from defense tests.
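ZDNet attributed the breach to SQL injection. As a generic illustration, unrelated to the systems actually affected, the standard defense is to pass user input to the database as bound parameters rather than splicing it into the query text:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (email TEXT, phone TEXT)")
conn.execute("INSERT INTO accounts VALUES ('alice@example.gov', '555-0100')")

user_input = "x' OR '1'='1"  # a classic injection payload

# Vulnerable pattern: attacker-controlled text becomes part of the SQL statement.
# query = f"SELECT * FROM accounts WHERE email = '{user_input}'"

# Safe pattern: the driver treats the input strictly as data, never as SQL.
rows = conn.execute("SELECT * FROM accounts WHERE email = ?", (user_input,)).fetchall()
print(rows)  # [] -- the payload matches nothing instead of dumping the table
```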
"#ProjectWhiteFox will conclude this year's series of attacks by promoting hacktivism worldwide and drawing attention to the freedom of information on the net," Team GhostShell wrote in a Pastebin post that included links to the stolen information.
Team GhostShell gained notoriety when they leaked information from more than 100 websites, including those of the Thai Navy and MIT. The politically minded hackers made headlines again when they hacked 100 prestigious universities and leaked 120,000 records to protest what they called the deteriorating quality of education. This latest breach is another blow to NASA, where computer security breaches have occurred with embarrassing frequency over the past two years. The space agency said it had lost more than 48 portable devices in addition to laptops stolen from employee vehicles in March 2011 and October of this year.
Team GhostShell may be in hacking hibernation for now, but it's almost certain that the activist hackers will return in 2013.
"Happy holidays and who knows, maybe we'll see each other again next year, the hackers wrote. They signed it, "GhostShell."
This story was provided by TechNewsDaily, a sister site to SPACE.com. Follow Ben on Twitter @benkwx
"#ProjectWhiteFox will conclude this year's series of attacks by promoting hacktivism worldwide and drawing attention to the freedom of information on the net," Team GhostShell wrote in a Pastebin post that included links to the stolen information.
Team GhostShell gained notoriety when they leaked information from more than 100 websites, including those of the Thai Navy and MIT. The politically minded hackers made headlines again when they hacked 100 prestigious universities and leaked 120,000 records to protest what they called the deteriorating quality of education. This latest breach is another blow to NASA, where computer security breaches have occurred with embarrassing frequency over the past two years. The space agency said it had lost more than 48 portable devices in addition to laptops stolen from employee vehicles in March 2011 and October of this year.
Team GhostShell may be in hacking hibernation for now, but it's almost certain that the activist hackers will return in 2013.
"Happy holidays and who knows, maybe we'll see each other again next year, the hackers wrote. They signed it, "GhostShell."
This story was provided by TechNewsDaily, a sister site to SPACE.com. Follow Ben on Twitter @benkwx
Friday, December 21, 2012
ElcomSoft Decrypts BitLocker, PGP and TrueCrypt Containers
December 20th, 2012, by Vladimir Katalov
BitLocker, PGP and TrueCrypt set the industry standard in the area of whole-disk and partition encryption. All three tools provide strong, reliable protection, and offer a solid implementation of strong crypto.
Normally, information stored in any of these containers is impossible to retrieve without knowing the original plain-text password protecting the encrypted volume. The very nature of these crypto containers suggests that their target audience is likely to select long, complex passwords that won’t be easy to guess or brute-force. And this is exactly the weakness we’ve targeted in our new product: Elcomsoft Forensic Disk Decryptor.
The Weakness of Crypto Containers
The main and only weakness of crypto containers is the human factor. Weak passwords aside, encrypted volumes must be mounted for the user to have on-the-fly access to encrypted data. No one likes typing a long, complex password every time they need to read or write a file. As a result, the keys used to encrypt and decrypt data being written to or read from protected volumes are kept readily accessible in the computer’s operating memory. Obviously, what is kept readily accessible can be retrieved almost instantly by a third-party tool such as Elcomsoft Forensic Disk Decryptor.
Retrieving Decryption Keys
In order to access the content of encrypted containers, we must retrieve the appropriate decryption keys. Elcomsoft Forensic Disk Decryptor can obtain these keys from memory dumps captured with one of the many forensic tools or acquired during a FireWire attack. If the computer is off, Elcomsoft Forensic Disk Decryptor can retrieve decryption keys from a hibernation file. It’s important that encrypted volumes are mounted at the time a memory dump is obtained or the PC goes to sleep; otherwise, the decryption keys are destroyed and the content of encrypted volumes cannot be decrypted without knowing the original plain-text password.
“The new product includes algorithms allowing us to analyze dumps of computers’ volatile memory, locating areas that contain the decryption keys. Sometimes the keys are discovered by analyzing byte sequences, and sometimes by examining crypto containers’ internal structures. When searching for PGP keys, the user can significantly speed up the process if the exact encryption algorithm is known.”
It is essential to note that Elcomsoft Forensic Disk Decryptor extracts all the keys from a memory dump at once, so if there is more than one crypto container in the system, there is no need to re-process the memory dump.
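ElcomSoft has not published its key-location algorithms, but the idea of scanning a memory dump for key-like byte sequences can be illustrated with a naive entropy scan. This is purely illustrative: real tools also parse the crypto containers' in-memory structures, as the quote above notes.

```python
import math
from collections import Counter


def shannon_entropy(window: bytes) -> float:
    """Bits of entropy per byte across the window; random keys score near 8."""
    counts = Counter(window)
    total = len(window)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())


def candidate_key_offsets(dump_path: str, key_len: int = 32,
                          step: int = 16, threshold: float = 4.8):
    """Scan a raw memory dump and yield offsets whose bytes look key-like."""
    with open(dump_path, "rb") as f:
        data = f.read()
    for offset in range(0, len(data) - key_len, step):
        window = data[offset:offset + key_len]
        if shannon_entropy(window) >= threshold:
            yield offset, window


# Hypothetical usage against a captured dump file:
# for offset, window in candidate_key_offsets("memory.dmp"):
#     print(hex(offset), window.hex())
```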
Using forensic software to take snapshots of a computer’s memory is nothing new. The FireWire attack method has existed for many years, but for some reason it is not widely known. The method is described in detail in sources such as http://www.securityresearch.at/publications/windows7_firewire_physical_attacks.pdf or http://www.hermann-uwe.de/blog/physical-memory-attacks-via-firewire-dma-part-1-overview-and-mitigation
The FireWire attack method is based on a known security issue affecting FireWire / i.LINK / IEEE 1394 links. One can take direct control of a PC or laptop’s operating memory (RAM) by connecting through a FireWire port; after that, grabbing a full memory dump takes only a few minutes. What makes this possible is a feature of the original FireWire/IEEE 1394 specification that allows external FireWire devices unrestricted access to the PC’s physical memory via Direct Memory Access (DMA). Because the access is DMA, the exploit works regardless of whether the target PC is locked or even logged on. There is no way to protect a PC against this threat other than explicitly disabling the FireWire drivers, and the vulnerability exists for as long as the system is running. Many free tools are available to carry out the attack, so Elcomsoft Forensic Disk Decryptor does not include a module to perform one.
If the computer is turned off, there are still chances that the decryption keys can be retrieved from the computer’s hibernation file. Elcomsoft Forensic Disk Decryptor comes with a module analyzing hibernation files and retrieving decryption keys to protected volumes.
Complete Decryption and On-the-Fly Access
With decryption keys handy, Elcomsoft Forensic Disk Decryptor can go ahead and unlock the protected disks. There are two different modes available. In complete decryption mode, the product will decrypt everything stored in the container, including any hidden volumes. This mode is useful for collecting the most evidence, time permitting.
In real-time access mode, Elcomsoft Forensic Disk Decryptor mounts encrypted containers as drive letters, enabling quick random access to encrypted data. In this mode files are decrypted on the fly as they are read from the disk. Real-time access comes in handy when investigators are short on time (which is almost always the case).
We are also adding TrueCrypt and BitLocker To Go plugins to Elcomsoft Distributed Password Recovery, enabling the product to attack plain-text passwords protecting the encrypted containers with a range of advanced attacks including dictionary, mask and permutation attacks in addition to brute force.
Unique Features
The unique feature of Elcomsoft Forensic Disk Decryptor is the ability to mount encrypted disks as a drive letter, using any and all forensic tools to quickly access the data. This may not seem secure, and may not be allowed by some policies, but sometimes the speed and convenience is everything. When you don’t have the time to spend hours decrypting the entire crypto container, simply mount the disk and run your analysis tools for quick results!
More Information
More information about Elcomsoft Forensic Disk Decryptor is available on the official product page at http://www.elcomsoft.com/efdd.html
Thursday, December 20, 2012
EU to propose mandatory reporting of cyber incidents | EurActiv
The European Union may force companies operating critical infrastructure in areas such as banking, energy and stock exchanges to report major online attacks and reveal security breaches, according to a draft report by the European Commission.
Next steps:
- Feb. 2013: European Commission expected to table proposal on mandatory reporting of cyber incidents.
EurActiv.com with Reuters
COMMENTS
- ALL companies that process citizens' personal details should be required to disclose breaches immediately. Is it not unethical to prioritise and privilege companies' concerns about the impact on their market share over the privacy, dignity and personal security of citizens? By: jlodge - Posted on: 18/12/2012
- Back in 2004/5 I did a report on Critical Infrastructure one part of it looked at CERTs (Computer Emergency Response Teams) which are supposed to respond to “cyber attacks”. Telcos and other large organisations have them, as do banks. Think of a CERT as a “fire-brigade” – external (professional ones) and internal ones. There are also private CERTS.
I spoke to an external private “CERT”. They specialised in banks and were a fund of entertaining stories of what goes wrong. One massive German bank suffered a serious DDOS attack and were unable to handle it (despite the bank having thrown considerable amounts of money at their internal CERT). So the bank CERT called in the real experts – who cracked the problem in short order (or so the external CERT claimed). Here’s the kicker, the internal CERT did not tell anybody (e.g. the main board) that they had to pull in outside help – and pretended that they cracked the problem themselves – which in a sense they did. If the internal CERT is a bit coy telling upper management of a problem (despite a requirement to do so) – why would they tell anybody else. The private CERT claimed the problem was endemic.
In the case of “real” critical infrastructure, such as control of power transmission networks – back in 2005 there were few problems. Generally speaking these were run on wholly private networks. However, as one UK guy noted, “the suits want more information” i.e. non-engineering managerial types want more systems operation data. This can lead to openings in systems that were previously wholly closed to “the Internet”.
The UK, as usual is talking bollocks. They already have a confidential/invitation only group (of utilities) which meets on a regular basis (attendance is obligatory) to exchange info on attacks etc.
Basically it comes down to this: keep customer facing systems (the retail operation as it were) wholly separate from network operation systems unless you want entertainment of the sort you will regret later. This is not hard to do – but usually costs a bit more – which means that the suits in the interests of economy will probably try and converge the two systems. So when the lights go off or the gas fails – you know who to blame – some half wit in a suit who thought he could save money for his utility. By: Mike Parr - Posted on: 19/12/2012
Tufin is rethinking enterprise security with an application-centric model
Summary: Managing security policies is a big headache for large organizations...
By Tom Foremski for Tom Foremski: IMHO
I recently spoke with Ruvi Kitov, CEO and co-founder of Tufin Technologies, which provides firewall policy management tools for very large companies.
Tufin is interesting because it is rethinking the way firewalls should be managed, driven by the rise in the number of applications enterprises are producing.
Firewall administrators are spending more of their time dealing with application-related change requests, yet app developers know little about firewalls, potential conflicts, or security holes. Earlier this year, Tufin launched SecureApp, a suite of admin tools to help manage this important security relationship between apps and firewalls.
This application centric approach is a different way of thinking about enterprise security. Here are some notes from our conversation:
- Our latest survey shows that nearly half of all firewall changes are related to application connectivity. And most companies report that they don't have confidence in their IT staff being able to fully address the compliance and security risks that arise when managing application connectivity.
- We realized that the best approach is to address security through the app layer first, to document the resources an app needs and how it behaves, and then to communicate what's needed in the firewalls. Our new product helps to automate this process. And it's integrated with our two other firewall products, SecureTrack and SecureChange.
- It's a paradigm shift and it might take some time for this to be understood but you have to tackle security first through the app layer not the network.
- Here's why: Large enterprises have a high degree of complexity because of multiple locations, multiple IT systems and hundreds of firewalls to manage with multitudes of rules. Developing new applications is tough because they must work across all of a corporation's firewalls.
- The situation becomes more complex when changes are made to an application and those changes have to be communicated to hundreds of firewalls.
- If firewalls aren't configured right, apps will fail. But app developers have to be better at communicating and documenting how their apps behave, so that the network security teams can make the right changes.
- Anytime you change the configuration of firewalls, other things can break. A key feature of SecureApp is that you can simulate the entire network and test changes safely.
- CIOs want to deploy apps faster but this can compromise security if there is little communication between the app developers and the security teams.
- There's often a cultural problem within large corporations in that the app developers don't understand the security issues and the security people don't understand the apps.
- There is often little or no documentation, and when people leave a company, a lot of knowledge about an app leaves too. SecureApp makes sure that there is documentation and that knowledge isn't lost when people leave.
Foremski's Take: Tufin's application-centric approach to improving enterprise security makes sense, and it won't take corporations long to realize it's the right approach.
What will take longer is the internal shift in culture, in the app developer teams, which traditionally have not been very security minded.
It's a leadership move by Tufin and one that's well timed. The explosion of apps in the consumer web is driving a tremendous amount of app development in the enterprise. Firewalls can quickly become brick walls or leave security holes open because of badly designed apps.
Managing hundreds of firewalls while trying to support a deluge of apps will quickly turn into a nightmare unless the whole process of application development can be mapped against an organization's firewalls. The app and the firewall have to be in sync and that requires new sets of tools.
Tools such as Tufin's not only provide an easy interface for managing security policies and compliance; they can also act as an agent of cultural change within organizations because they offer common ground for the app and security teams, helping them communicate with each other, which should lead to better apps.
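Tufin has not published SecureApp's internals, but the application-centric idea of documenting an app's connectivity needs and deriving firewall changes from that document can be sketched with a toy data model. All names, addresses, and ports below are invented for illustration:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ConnectivityRequirement:
    """One network flow an application needs (e.g. app tier -> database)."""
    source: str
    destination: str
    port: int
    protocol: str = "tcp"


@dataclass
class Application:
    name: str
    requirements: List[ConnectivityRequirement]


def to_firewall_rules(app: Application) -> List[str]:
    """Translate the app-level connectivity document into generic allow rules."""
    return [
        f"allow {r.protocol} from {r.source} to {r.destination} port {r.port}  # {app.name}"
        for r in app.requirements
    ]


billing = Application("billing", [
    ConnectivityRequirement("10.1.0.0/24", "10.2.0.10", 5432),  # app servers -> database
    ConnectivityRequirement("10.1.0.0/24", "10.3.0.20", 443),   # app servers -> payment gateway
])
for rule in to_firewall_rules(billing):
    print(rule)
```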
Monday, December 17, 2012
Wednesday, December 12, 2012
Increasing cloud adoption puts enterprises at risk
December 12, 2012
Enterprises are running one-third of their mission-critical applications in the cloud today and expect to have half of all critical applications running in the cloud by 2015, according to SailPoint.
In many cases, IT organizations are not fully aware of which cloud applications are in use across the enterprise, which makes it more difficult than ever for enterprises to monitor and control user access to mission-critical applications and data. In fact, only 34% of companies bring IT staff into the vendor selection and planning process when a cloud application is procured without using IT's budget, making it very difficult to proactively address security and compliance requirements for those applications.
SailPoint's survey found that business users have gained more autonomy to deploy cloud applications without IT involvement, yet they do not feel responsible for managing access control. In fact, 70% of business leaders believe that IT is ultimately responsible for managing user access to cloud applications. Adding to IT’s challenge, more than 14% of business leaders admit they have no way of knowing whether sensitive data is stored in the cloud at all. This lack of visibility and control greatly increases an organization's risk of security breaches, exposure to insider threats and failed audits.
"As organizations adopt cloud applications, they are very likely to increase their risk exposure by putting sensitive data in the cloud without adequate controls or security processes in place," said Jackie Gilbert, VP and GM of SailPoint's Cloud Business Unit. "And this year's survey illustrates how 'at risk' companies already are. Many companies lack visibility not only to what data is in the cloud, but also to who can access that data. It's imperative that companies put in place the right monitoring and controls to mitigate these growing risks."
The consumerization of IT has led to employees taking advantage of new technologies, but it will require organizations to evolve their identity and access management processes. For example, while workplace policies such as BYOD (bring your own device) give business users the flexibility to use their own mobile devices, those very same devices are being used to access corporate applications in more than 95% of cases.
The ability for users to access corporate applications and data outside of the corporate network puts identity and access management under further strain because IT must now account for user access from a wider variety of devices not completely under their control.
This "consumerization" phenomenon is not only affecting devices but also applications, as many corporate employees are moving beyond BYOD to "bring your own application" (BYOA). BYOA means that today's business users are much more comfortable using consumer or “non-approved” applications for work activities. Less than a third of companies are fully locked down when it comes to application usage at work, which means that these activities frequently take place outside the purview of IT.
Alarmingly, the trend also extends to employees using the same passwords for a variety of accounts spanning their personal and professional lives. About half of the business leaders surveyed stated they frequently use the same password for personal web applications as they do for sensitive work applications. This exposes enterprises to new risks and security vulnerabilities should any of those personal applications experience a security breach.
"For the third year in a row, our Market Pulse Survey shows that the majority of large companies remain very concerned about security breaches and their ability to meet regulatory compliance requirements," said Kevin Cunningham, president of SailPoint. "This is due in part to the ever changing IT landscape that make existing identity management issues even larger. The consumerization of IT has put enterprises in a difficult position: they want to provide business users the convenience and flexibility promised by cloud and mobile devices, but they must also make sure controls are in place to monitor and manage who has access to what. Regardless of where customers are with their IAM strategy, they need to proactively consider how to govern these new technologies and behaviors within their corporate policies."
Friday, December 7, 2012
Information Security in the Era of Big Data
The modern business world is faced by the challenge of ever-expanding data volumes. While mountains of information are normally associated with analytics trends such as big data, the amount of digital content in general is rapidly expanding. According to IDC's predictions for 2012, the total amount of digitally stored content will reach 2.7 zettabytes (2.7 billion terabytes) by the end of the year. This represents a 48 percent increase from 2011, and total volume is expected to reach 8 ZB by 2015.
IDC's projections signal a growing data migration to mobile and cloud solutions, with third-party technologies expected to drive 20 percent of IT spending in the coming years. Particularly in emerging markets, mobile devices will play a significant role in the lives of technology professionals.
"As the number of intelligent communicating devices on the network will outnumber 'traditional computing' devices by almost two to one, the way people think about interacting with each other, and with devices on the network, will change," an IDC report stated.
Preparing for the data shift
There are numerous data security questions that must be answered as companies grapple with these trends. Decision makers must be careful in the planning stages to ensure technology partners follow security best practices. Even then, migrating data to the cloud represents a significant risk because the business loses control over the protections used to guard its information, but will still be held responsible if the vendor's system is breached - or when a negligent insider compromises cloud-stored data. Particularly in the era of big data, with information itself a valuable asset, businesses would be wise to keep some control over their data security postures.
Writing for Dark Reading, the CTO of security consulting firm Securosis, Adrian Lane, made note of several security practices used in today's technology architecture. Data encryption software is the best solution for protecting files at rest and archival information. This technology makes critical data unreadable to any third party without the encryption key. Lane noted that it is important to leverage a comprehensive mixture of perimeter-based defenses, such as network security, and data-centric solutions, such as file encryption. This will become increasingly important as big data architectures become more common, since most analytics solutions do not provide data protection functionality by themselves.
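As a concrete illustration of the file-encryption approach Lane describes (not a PKWARE or Securosis example), a widely used library such as Python's cryptography package can render data at rest unreadable to anyone without the key. The data and file name below are placeholders:

```python
from cryptography.fernet import Fernet  # third-party 'cryptography' package

key = Fernet.generate_key()   # keep the key separate from the data it protects
fernet = Fernet(key)

plaintext = b"name,card_last4\nAlice,4242\n"   # stand-in for a sensitive data file
ciphertext = fernet.encrypt(plaintext)

with open("records.enc", "wb") as f:           # only the ciphertext is stored at rest
    f.write(ciphertext)

# Later, only a holder of the key can recover the original bytes:
assert fernet.decrypt(ciphertext) == plaintext
```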
Monday, December 3, 2012
Bank Agrees to Reimburse Hacking Victim $300K in Precedent-Setting Case
In a case watched closely by banks and their commercial customers, a financial institution in Maine has agreed to reimburse a construction company $345,000 that was lost to hackers after a court ruled that the bank’s security practices were “commercially unreasonable.”
People’s United Bank has agreed to pay Patco Construction Company all the money it lost to hackers in 2009, plus about $45,000 in interest, after intruders installed malware on Patco’s computers and stole its banking credentials to siphon money from its account.
Patco had argued that the bank’s authentication system was inadequate and that it failed to contact the customer after its automated system flagged the transactions as suspicious. But the bank maintained that it had done due diligence because it verified that the ID and password used for the transactions were authentic.
The case raised important questions about how much security banks and other financial institutions should be reasonably required to provide commercial customers.
Small and medium-sized businesses around the country have lost hundreds of millions of dollars in recent years to similar thefts, known as fraudulent ACH (Automated Clearing House) transfers, after their computers were infected with malware that swiped their bank account credentials. Some have been lucky to recover the money from banks that valued their business, but others, like Patco, were told by their banks that they were responsible for the loss.
Although the assets of customers with personal bank accounts are protected under federal law, commercial bank accounts are not. The only recourse such customers have when their bank refuses to assume responsibility for stolen funds is to try to pursue their money in state courts under the Uniform Commercial Code.
People’s United Bank agreed to the settlement only after an appellate court indicated that the bank’s security system and practices had been inadequate under the UCC.
“This case says to banks and to commercial customers … that there are circumstances in which the bank cannot shift the risk of loss back to the customer, and we’re not going to assume that security procedures are commercially reasonable just because the bank has a system that they say is state of the art,” says attorney Dan Mitchell, who represented Patco.
Last year, a U.S. District Court in Maine ruled that People’s United Bank wasn’t responsible for the lost money, and granted the bank’s motions for a summary dismissal of Patco’s complaint. A magistrate agreed with the ruling, saying in part that although the bank’s security procedures “were not optimal,” they were comparable to those offered by other banks.
But judges with the First Circuit Court of Appeals ruled last July that the bank’s security system wasn’t “commercially reasonable,” and advised the two parties to try to come to a settlement, which they did about a week ago. Patco will not be reimbursed attorneys’ fees under the settlement.
Patco, a family-owned business in Sanford, Maine, sued Ocean Bank, which is owned by People’s United Bank, after discovering in May 2009 that hackers were siphoning about $100,000 per day from its online bank account. The hackers had sent a malicious e-mail to employees that allowed them to surreptitiously install the Zeus password-stealing trojan on an employee computer.
After obtaining Patco’s banking credentials and waiting for its account to fill up with money, the hackers used the credentials to initiate a series of electronic money transfers over seven days. Nearly $600,000 worth of transfers were made out of the account via six transactions before Patco realized it had been hacked.
Ocean Bank, after being notified of the fraud, was able to block about $240,000 in transfers. But Patco was unable to retrieve the rest.
Patco, which had been banking with Ocean Bank for 24 years, sued the bank for failing to notice the fraudulent activity and stop it, saying that its security system was not “commercially reasonable” under the Uniform Commercial Code. Under Article 4A of the code, a bank receiving a payment order generally bears the loss of any unauthorized requests for fund transfers. The code also maintains that the “burden of making available commercially reasonable security procedures” belongs to the bank, because they “generally determine what security procedures can be used and are in the best position to evaluate the efficacy of procedures offered to customers to combat fraud.”
Patco maintained that the bank’s security system was inadequate and that the bank did not comply with its own security procedures.
Although the bank’s security system flagged the transactions as unusually “high-risk” because the timing, value and geographical location of the transactions were inconsistent with the pattern of other transactions Patco had made, the bank didn’t notice the alerts and let the transfers go through without notifying Patco.
Patco generally made transfers only once a week, on Fridays, to make payroll payments, and the company made them from computers housed in its offices in Maine, which all used the same IP address. The most it ever transferred was about $36,000. Most of the fraudulent transactions were made in amounts exceeding $90,000, and they were initiated from different IP addresses. The money was also transferred to multiple people who had never received payments from Patco before. The fraudulent activity was caught only after some of the transactions were sent to bank accounts that didn’t exist, causing the transfers to fail. When Patco was notified about the failed transactions, it determined that the transfers had never been authorized.
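The deviations described above amount to a simple anomaly profile, and the idea can be sketched with a small, hypothetical rule-based check. The `Transfer` and `risk_flags` names, the thresholds, and the profile values below are illustrative assumptions only; they are not drawn from the NetTeller system or the bank’s actual configuration.

```python
# Hypothetical illustration of rule-based ACH risk flagging.
# Field names, thresholds, and scoring are assumptions for illustration;
# they do not describe the bank's actual system.

from dataclasses import dataclass

@dataclass
class Transfer:
    amount: float          # dollars
    day_of_week: str       # e.g. "Friday"
    source_ip: str
    payee: str

def risk_flags(t: Transfer, profile: dict) -> list[str]:
    """Return the reasons a transfer deviates from the customer's history."""
    flags = []
    if t.amount > profile["max_amount"]:
        flags.append(f"amount ${t.amount:,.0f} exceeds historical max "
                     f"${profile['max_amount']:,.0f}")
    if t.day_of_week not in profile["usual_days"]:
        flags.append(f"initiated on {t.day_of_week}, outside the usual schedule")
    if t.source_ip not in profile["known_ips"]:
        flags.append(f"unfamiliar source IP {t.source_ip}")
    if t.payee not in profile["known_payees"]:
        flags.append(f"payee '{t.payee}' has never been paid before")
    return flags

# Profile loosely modelled on the pattern described in the article:
# weekly payroll transfers on Fridays, one office IP, capped around $36,000.
patco_like_profile = {
    "max_amount": 36_000,
    "usual_days": {"Friday"},
    "known_ips": {"203.0.113.10"},
    "known_payees": {"payroll-processor"},
}

suspicious = Transfer(amount=90_000, day_of_week="Tuesday",
                      source_ip="198.51.100.77", payee="unknown-recipient")
for reason in risk_flags(suspicious, patco_like_profile):
    print("HIGH RISK:", reason)
```

As the court record indicates, the bank’s own system did generate alerts of this kind; the failure lay in nobody acting on them before the transfers cleared.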
Patco accused the bank of failing to implement “best” security practices, such as requiring customers to use multifactor authentication.
The bank used a system called NetTeller, made by Jack Henry & Associates, a firm that works with numerous banks. Jack Henry uses the same system for 1,300 of its 1,500 bank customers. The system offers a number of authentication options, but the bank rejected most of them and configured the system in a way that increased the risk for customers like Patco.
“They had a decent system, but they configured it improperly and they didn’t use it properly,” says Mitchell.
Although the system used challenge questions to ferret out fraudsters, it relied on only three security questions and asked one or more of them with every transaction Patco made. Because the hackers had installed keystroke-logging software on Patco’s computers, they were able to record not only the user name and password for the account but also the answers to the three security questions that Patco employees had set up for it.
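The underlying weakness is that static answers, once keylogged, can be replayed indefinitely, whereas a one-time code expires. The sketch below contrasts the two using a standard time-based one-time password (TOTP, RFC 6238); it is a generic illustration and does not describe the authentication options NetTeller actually offered.

```python
# Why static challenge answers fail against a keylogger, contrasted with a
# time-based one-time password (TOTP, RFC 6238). Generic sketch only; it does
# not describe the bank's system or the options its vendor offered.

import hmac, hashlib, struct, time

def totp(secret: bytes, for_time: float, step: int = 30, digits: int = 6) -> str:
    """Standard TOTP: HMAC-SHA1 over the current 30-second counter."""
    counter = int(for_time // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

secret = b"shared-secret-provisioned-out-of-band"

# A static challenge answer captured by a keylogger can be replayed verbatim
# at any later login; nothing about it changes between sessions.
captured_answer = "mother's maiden name: Smith"
stored_answer = "mother's maiden name: Smith"
print("replayed static answer accepted?", captured_answer == stored_answer)  # True

# A captured TOTP code stops working once its 30-second window passes.
old_code = totp(secret, time.time() - 300)   # code logged five minutes ago
current = totp(secret, time.time())
print("replayed one-time code still valid?", old_code == current)  # almost always False
```

A keylogger that captures a one-time code gains at most a 30-second window, which is why the multifactor approaches Patco pointed to resist this class of attack far better than reusable challenge answers.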
The appellate court ruled that the bank had substantially increased the risk of fraud by asking the security questions with every transaction and that this, in conjunction with a number of other failures, rendered the security system unreasonable.
Although the UCC places some burden on the customer to “exercise ordinary care,” the court found that it was unclear what obligations a customer had when the bank’s security system was found to be commercially unreasonable.
Patco is not the first company to sue its bank over fraudulent money transfers. Experi-Metal sued its bank, Comerica, in 2009 after losing more than $550,000 in fraudulent wire transfers. Other cases are wending their way through courts around the country.
In 2010, the FBI disrupted a multinational cybertheft ring involving fraudulent ACH transfers. The thieves, using the Zeus malware, targeted small and medium-sized businesses, municipalities, churches and individuals. The scammers were able to steal more than $70 million from victims.
Security experts debate moving critical infrastructure online
Paul Simmonds, Co-Founder of The Jericho Forum, has suggested that companies attempting to reduce costs by moving critical systems online could be opening themselves up to cyber attacks. Speaking at the Cybergeddon 2012 event, Mr Simmonds was echoed by other security experts, who cited the discovery of highly advanced malware this year as a reason for greater caution.
This comes shortly after a researcher at security firm Exodus Intelligence discovered 23 vulnerabilities in industrial control systems from a variety of manufacturers, and the identification of further SCADA application vulnerabilities by Italian security company ReVuln last week.
Paul Davis, Director of Europe at FireEye, made the following comments:
The message is clear – when it comes to critical infrastructure, extreme vigilance is needed when taking the leap of faith into the online world, and cost saving cannot be the cause of any premature decision making. As our world becomes increasingly connected, with the internet controlling more aspects of daily life – the change needs to be reflected in the way that we think about security.
The security implications of the Internet of Things are enormous, and are still widely misunderstood. However, while data loss and fraud are terrible outcomes of a breach, an intrusion on our control systems could have significantly more devastating consequences.
For SCADA systems in particular, it is essential that the security of the management platforms behind them is absolutely bulletproof – as any web-based attack on these systems would first have to penetrate this layer before moving on to the final target. As such, rapid detect and response solutions must be in place to thwart any threats immediately – and as evidenced by the calibre of malware being discovered today, traditional security tools simply do not go deep enough.
The rate at which international cybercrime is evolving has created a very steep learning curve for us all. GCHQ and other government organisations are doing a good job of publicising their efforts to boost collaboration, funding and overall cyber readiness initiatives – and hopefully with the right investment in the most appropriate defences, we will be well on our way to becoming a centre of cyber security excellence.