Thursday, March 17, 2011

Some thoughts on RFID Privacy Protection

Key points for privacy

·         Minimize sensitive data on the chip. Store only the data that is actually necessary.
·         Use a Faraday cage, a shield that blocks unauthorized signals from reaching the chip.
·         Encrypt the data so that only authorized parties can decrypt it (a sketch follows this list).
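As a minimal illustration of the encryption point above, the sketch below encrypts a tag payload with an authenticated symmetric cipher so that only readers holding the shared key can recover it. Python and the cryptography package's AESGCM primitive are illustrative assumptions here; real RFID chips use far more constrained ciphers.

# Illustrative sketch only: protect the payload written to an RFID tag so that
# only readers holding the shared key can recover it. The library choice and
# key handling are assumptions for this example, not part of the original post.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_tag_payload(key: bytes, payload: bytes) -> bytes:
    """Return nonce || ciphertext suitable for storing on the tag."""
    nonce = os.urandom(12)                      # fresh nonce for every write
    return nonce + AESGCM(key).encrypt(nonce, payload, None)

def decrypt_tag_payload(key: bytes, blob: bytes) -> bytes:
    """An authorized reader recovers the plaintext; tampering raises an error."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=128)       # shared only with authorized readers
stored = encrypt_tag_payload(key, b"holder:123456789")
print(decrypt_tag_payload(key, stored))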

One major protection option is the use of jackets for passports, credit cards, ID cards, etc. These are plastic shields that act as an electromagnetically opaque sleeve.

Verayo employs its PUF technology for secure access to the RFID. The PUF adds a challenge-response system for access that can draw on roughly 2^64 possible challenge-response pairs, and the challenge-response pair changes after each use.
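To make the challenge-response idea concrete, here is a small software sketch under stated assumptions: a keyed hash stands in for the physical function (a real PUF derives responses from chip-level manufacturing variation), and the reader keeps a pool of enrolled challenge-response pairs, discarding each one after use. None of the names or numbers below come from Verayo.

# Hypothetical sketch of a one-time challenge-response check in the spirit of
# the scheme above. A keyed hash stands in for the physical unclonable function.
import hashlib
import hmac
import os
import secrets

class TagSimulator:
    """Stands in for the tag's physical function (illustration only)."""
    def __init__(self, device_secret: bytes):
        self._secret = device_secret

    def respond(self, challenge: bytes) -> bytes:
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()

class Reader:
    """Holds enrolled challenge/response pairs; each pair is used only once."""
    def __init__(self):
        self._pairs = {}                               # challenge -> expected response

    def enroll(self, tag: TagSimulator, n: int = 10):
        for _ in range(n):
            challenge = os.urandom(16)                 # large challenge space
            self._pairs[challenge] = tag.respond(challenge)

    def authenticate(self, tag: TagSimulator) -> bool:
        challenge, expected = self._pairs.popitem()    # consumed after one use
        return secrets.compare_digest(tag.respond(challenge), expected)

tag = TagSimulator(os.urandom(32))
reader = Reader()
reader.enroll(tag)
print(reader.authenticate(tag))                        # True, and that pair is now gone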

References:

Unclonable RFIDs

Physical Unclonable Functions

RFID tags

Privacy Preserving Data in Data Mining

What is Data Mining

Data mining looks for patterns, associations, and trends.  In this discovery process, personal and sensitive data are sometimes inadvertently exposed, or extracted through covert intrusions into the mining process.  Maximizing data privacy begins with asking the question, ‘What data is actually needed for the analysis?’

What Data Is Necessary in Data Mining

Depending on the industry, personal information carries more or less meaning.  In healthcare or education, the name is important; the number of times a person buys a certain laundry detergent does not require a name association.  Data may cross several databases from different organizations.  Sensitive data, such as salary, is pertinent to accounting, while a health condition is relevant to a doctor, not to accounting.  The data of interest may be a set of attributes associated with events, persons, or places.  When the intent of a data analysis requires collaboration, how much data should be visible to the collaborating agencies?

Data Mining Process

Techniques for privacy differ according to the distribution of the data.  Centralized data, typically in corporate or health databases, use perturbation privacy techniques.  Privacy-preserving data mining performs its analysis on data transformed prior to mining; for example, the age 45 becomes the range 40 to 49, possibly represented by the numeric value 4.  Adding ‘noise’ (techniques that blur recognition of the actual values, such as false or extra records) also helps hide the data.  Distributed data, horizontally or vertically partitioned and typically associated with cross-database analyses, employ cryptographic privacy techniques.  Transferring encrypted data, with sensitive fields removed or hidden in some fashion, minimizes the impact of any intrusion.
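As a small illustration of the age-generalization step just described, the sketch below maps exact ages to decade bins before the records are handed to any mining step. The field names and data are made up for the example.

# Illustrative perturbation step: exact ages are generalized to ten-year bins
# before mining, so 45 is released as the bin "40-49", coded here as 4.
def generalize_age(age: int) -> int:
    """Map an exact age to its decade bin code, e.g. 45 -> 4 (meaning 40-49)."""
    return age // 10

records = [{"age": 45, "purchases": 12}, {"age": 62, "purchases": 3}]
sanitized = [{"age_bin": generalize_age(r["age"]), "purchases": r["purchases"]}
             for r in records]
print(sanitized)   # [{'age_bin': 4, 'purchases': 12}, {'age_bin': 6, 'purchases': 3}]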

The Model-Building Process

Mining data is primarily an inductive learning process.  Finding patterns and identifying classes or groupings of attributes are the focuses of the process.  The model developed by the data mining process must be adaptive; the focus can be steered toward a different set of patterns as needed.
Through supervised learning, the modeling process provides insight into trends and concentrations of specified attributes; through learning from examples, it mines the data and discovers potential patterns not necessarily considered beforehand.  Machine learning and statistical methods are major players in data mining analysis.  The primary purpose of the data mining process is to uncover information from very large amounts of data.
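A brief sketch of the supervised case may help: a model is induced from labeled examples and then applied to unseen data. The use of scikit-learn and the toy features below are assumptions for illustration only.

# Hedged sketch of supervised model building: induce a model from labeled
# examples, then apply it to new cases. scikit-learn is assumed for brevity.
from sklearn.tree import DecisionTreeClassifier

# Toy examples: [years_of_experience, has_doctorate] -> suitable for the role?
X = [[10, 1], [2, 0], [7, 1], [1, 0], [15, 1]]
y = [1, 0, 1, 0, 1]

model = DecisionTreeClassifier().fit(X, y)      # inductive learning from examples
print(model.predict([[8, 1], [3, 0]]))          # learned pattern applied to unseen cases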

Approaches to Privacy in Data Mining

Data mining is an analytical process over large data sets.  The process does not care what the data represents.  The results of the process provide information for making decisions or for reviewing the status of an organization’s products and services.  Since the process is essentially blind to the data, preprocessing and/or hiding sensitive and personal data is necessary, starting with answering: ‘What is the need? What is the outcome?’
There are three techniques used for preserving privacy, and algorithms use them in any combination to maximize privacy: association, classification, and clustering.  Association rule mining discovers occurrences that happen together, i.e., ‘if this then that’ tends to occur 40% of the time (Dunham, 2000).  By accepting the 40% figure as a rule, the actual data occurrences are not physically part of the subsequent analysis.
Classification mining determines rules of association that classify the data (Dutt, 2005).  For example, someone with many years working in a career, a doctorate, and teaching experience is classified as a top candidate for a CTU professor position, while someone with only a master’s degree would be a candidate for undergraduate courses.  Classification mining also feeds machine learning techniques such as neural networks, and the analysis may see only the classifications rather than the underlying records.  Data mining that employs clustering mining likewise sees sensitive data replaced by categories.  These approaches work for centralized data; in distributed data analysis, the addition of cryptographic techniques provides additional privacy.
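The sketch below illustrates the association-rule idea from the paragraph above: the rule ‘if this then that’ is kept along with how often it holds, while the individual transactions themselves need not be exposed downstream. The items and numbers are invented for the example.

# Minimal sketch of an association rule: measure how often "if this then that"
# holds across transactions; the rule and its confidence are what get reported.
def rule_confidence(transactions, antecedent, consequent):
    """Fraction of transactions containing `antecedent` that also contain `consequent`."""
    with_antecedent = [t for t in transactions if antecedent in t]
    if not with_antecedent:
        return 0.0
    return sum(consequent in t for t in with_antecedent) / len(with_antecedent)

transactions = [
    {"bread", "butter"}, {"bread", "jam"}, {"bread", "butter", "milk"},
    {"milk"}, {"bread"},
]
print(rule_confidence(transactions, "bread", "butter"))   # 0.5 -> "if bread then butter" half the time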

Wednesday, March 9, 2011

Privacy Preserving Data in Data Mining


Introduction

Data mining looks for patterns, associations, and trends.  In this discovery process, personal and sensitive data are sometimes inadvertently exposed, or extracted through covert intrusions into the mining process.  Maximizing data privacy begins with asking the question, ‘What data is actually needed for the analysis?’  There are two general approaches to preserving privacy in data mining: perturbation and cryptographic.

Techniques for privacy differ according to the distribution of the data.  Centralized data, typically in corporate or health databases, use perturbation privacy techniques.  Privacy-preserving data mining performs its analysis on data transformed prior to mining; for example, the age 45 becomes the range 40 to 49, possibly represented by the numeric value 4.  Adding ‘noise’, techniques that blur recognition of the actual data, also helps hide it.  Distributed data, horizontally or vertically partitioned and typically associated with cross-database analysis, employ cryptographic privacy techniques.  Transferring encrypted data, with sensitive fields removed or hidden in some fashion, minimizes the impact of any intrusion.

What is Data Mining

·         Modeling:
o   Initial Exploration of Data
o   Model building or Pattern identification w/ Validation/Verification
o   Deployment

·         The Model-Building Process
o   Inductive Learning
o    Finding patterns
o    Identifies classes
o    Model must be adaptive

o    Supervised Learning – definition
OR
o    Unsupervised Learning – examples

·         Methods Used
o   Statistical Methods
o   Machine Learning

·         Where Does Privacy Preservation Come In?
o   Process Analytically versus Safeguarding Sensitive Data
o   What is the need?

Types of Information to Protect in Data Mining

·         Personal information
·         Sensitive information
·         Collaboration among different agencies

Privacy-Preserving Technology

·         Data Mining Algorithm Classification
o   Privacy-Preserving Association Rule Mining
o   Privacy-Preserving Classification Mining
o   Privacy-Preserving Clustering Mining

·         Random Data Perturbation Methodologies
o   Centralized Data
o   Add random noise to confidential numerical attributes (see the sketch after this outline).
o   Guarantees no complete disclosure
o   Partial disclosure is still possible

·         Cryptography-Based Methodologies
o   Distributed Data
o   Secure Multiparty
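Below is a brief sketch of the random-perturbation idea from the outline above: zero-mean noise is added to a confidential numeric attribute before release, masking individual values while keeping aggregates roughly usable. The noise scale and the salary figures are illustrative assumptions, not recommendations from the cited papers.

# Hedged sketch of random data perturbation: add zero-mean Gaussian noise to a
# confidential numeric attribute before release. Individual values are masked,
# but summary statistics remain approximately correct (partial disclosure risk
# remains, as noted above).
import random

def perturb(values, scale=5000.0):
    """Return the values with zero-mean Gaussian noise added to each entry."""
    return [v + random.gauss(0.0, scale) for v in values]

salaries = [52000, 61000, 58000, 73000]
released = perturb(salaries)
print(released)                              # individual salaries are masked
print(sum(released) / len(released))         # the mean stays close to the true mean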

References:

Acquisti, Alessandro, Gritzalis, Stefanos, Lambrinoudakis, Costas, De Capitani di Vimercati, Sabrina (2008).  “Digital Privacy: Theory, Technologies, and Practices.”  Auerbach Publications, Taylor & Francis Group, LLC.

Evfimievski, Alexandre, Grandison, Tyrone (2009).  “Privacy Preserving Data Mining.”  IBM Almaden Research Center.

Shen, Yanguang, Han, Junrui, Shao, Hui (2009).  “Research on Privacy-Preserving Technology of Data Mining.”  IEEE, Second International Conference on Intelligent Computation Technology and Automation.

Tuesday, March 1, 2011

BS Chapter 11. Forcing Firms to Focus: Is Secure Software in Your Future?

 “Software Vendors Give Us What We Want but Not What We Need” (Beautiful Security)

All requirements for an application are important, explicit as well as implicit. A central question becomes, ‘What is the cost, and what makes that cost effective for a profitable package?’ Explicit requirements are obvious: those that fulfill a customer need. Implicit requirements are not always considered beyond added ‘bells and whistles’. The author offers three examples of how implicit requirements can be powerful: The Burger, The Book, and The Software Application. If the burger or the book is not acceptable to the buyer as advertised, the vendor replaces it in order to maintain its reputation. For a software application, the implicit understanding has to do with functioning as advertised. Security, while an implicit requirement in today’s world, does not provide any additional functionality; in fact, the customer may not discover a security flaw until some time after purchasing the application package.


There are a number of security-vulnerability detection tools. Even so, “There is no standard way of assessing software security.” Many of the available assessments uncover vulnerabilities from a specific perspective, such as SQL/database, Web services, or a specific type of malware. In addition, the threats continue to change, so changes to a product’s security features are only as protective as the latest set of known vulnerability attributes. This raises a question: ‘Can security measures depend on a model of security threats that captures the majority of intrusion techniques and methods?’

“Unsurprisingly, recent trends in security threat data clearly show the migration of hacker exploits away from network perimeters (routers, switches, firewalls, etc.) to the web application layer.” 

A good developer checks out the functionality of the package before presenting the application to the market. Subsystem and system tests verify functionality and confirm that the product integrates all of the functions without errors. This begins with the initial design, before any development starts. Changes after that point often incur increased costs, creating the potential for a project shutdown, and this holds true for any change. “… 50% of software vulnerabilities are based on defects in the architecture, not the code itself.” Including security as part of the initial design can therefore address security problems early in the software lifecycle.

A more complex design may mean more resources and more cost to develop, and the complexity of security intrusions adds complexity to the software design. Planning the design around the known places that open doors to intrusion is an important design step. There are a growing number of tools to help discover some of the loopholes in a design. A couple are noted below:
·         CLASP (Comprehensive, Lightweight Application Security Process)
·         Verify

While static tools are important, consideration should also be given to software that interacts with the newly developed software. Verify is one package available for this. Verify provides useful information for the vendors of the interfaced software, and a level of trust in an organization’s software application.


The author suggests two fundamental reasons for security controls in the software development process:
·         Security flaws are likely to be discovered in design, in the security and authorization architecture, in the actual software code, and in the configuration settings of the devices supporting the application. Therefore, multiple controls at different phases are essential.
·         Identification and remediation of vulnerabilities earlier in the software lifecycle lowers operating costs.

References and Some Interesting Articles:

 Top 10 2010-A1-Injection

http://www.owasp.org/index.php/Top_10_2010-A1-Injection

Verify

McAfee, Inc. Research Reveals Mothers Rate Cyber Dangers as High as Drunk Driving or Experimenting With Drugs

http://www.owasp.org/index.php/Main_Page

Oram, Andy, Viega, John. Beautiful Security. Publisher: O'Reilly Media, Inc., Pub. Date: April 28, 2009

 


Monday, February 28, 2011

DP – Chapter 01 Privacy Enhancing Technology

Chapter 1 of Digital Privacy is about privacy-enhancing technology. Email privacy processes are largely the result of common sense, based on understanding how an email travels over networks and into harm’s way of those wanting information about the sender.

The first thought is, ‘How do we make the sender’s identity anonymous?’ The initial technique, Type 0, incorporates the concept of remailers that remove the sender’s ID and replace it with a pseudonym. The problem occurs when someone obtains the list of pseudonym mappings, which uncovers senders’ real identities.

The next step, Type I, was chaining remailers with encrypted data. Since a number of emails travel across each proxy server, the order was changed so that the first email packet in was not automatically the first packet out. Traffic analysis, however, still uncovered size relationships, which are pieces of information that can trace the originator.

To obviate this traffic analysis and make it more complex, Type II techniques sized the packets equally. Type III added further techniques to provide more anonymity.

There are a number of interactive anonymity and pseudonymity systems. Pipenet came first, but the original concept shut down the entire link when a breach was found. Anonymizer.com initially used a Type 0 remailer for Web services. Then the Onion Routing method, encryption in layers, was developed by US Navy research offices. The Freedom Network used the onion concept but depended on paid support for its proxy servers; Tor followed with nodes run by volunteers.
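To make ‘encryption in layers’ concrete, here is a minimal sketch of the onion idea: the sender wraps the message once per relay, and each relay peels off exactly one layer. Fernet from the cryptography package is only a convenient stand-in cipher; real onion routing negotiates circuits and uses different cryptography.

# Illustrative sketch of layered ("onion") encryption: wrap once per relay,
# then each relay in order removes exactly one layer. Stand-in cipher only.
from cryptography.fernet import Fernet

relay_keys = [Fernet.generate_key() for _ in range(3)]   # one key per relay

def wrap(message: bytes, keys) -> bytes:
    """Encrypt for the last relay first, then add a layer for each earlier hop."""
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def route(onion: bytes, keys) -> bytes:
    """Each relay along the path peels one layer; only the last sees the plaintext."""
    for key in keys:
        onion = Fernet(key).decrypt(onion)
    return onion

onion = wrap(b"hello across the anonymity network", relay_keys)
print(route(onion, relay_keys))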

Today there are a number of communication privacy systems with familiar names that have evolved from these beginnings: PGP, or Pretty Good Privacy, for email; Secure Sockets Layer and Transport Layer Security for web traffic; and Off-the-Record Messaging for IM. Off-the-Record also provides deniability for IM participants.

                       
For a look at one review of certificate authorities:

For one list of Trusted Root Certificate Authorities, 2/10/2010, and BuiltInCAs-January-2011

Thursday, February 17, 2011

Comments on “Here Come the Info Security Lawyers!” (Beautiful Security Chapter 12)


This chapter discusses culture, balance, communication, and doing the right thing.  The combination of these areas contributes to an overall security policy and protection system from unwanted security intrusions.

Culture – Within any culture there are accepted ways of everyday life, and within an organization, attitudes and approaches to events filter down from the top.  The early days of security breaches found easy paths into existing systems.  As prevention measures evolved, so did the complexity of the intrusions.  The initial approach to information protection was to gather information to detect existing attack techniques, and computer security was not an integral part of information budgets.

The need today is to make security an integral part of information budgets, beginning at the top levels.  The 2007 Commission on Cyber Security recommended a national security strategy.  The commission suggested a balanced approach that “avoids prescriptive mandates” and “avoids overreliance on market forces.”  A key suggestion is to incorporate a “public-private advisory group” to focus on “key infrastructures.”  The overall trend needs to begin at the top of every organization: a cultural acceptance of information security as an integral part of everyday events.

Balance – Organizations must take into account the cost of lost productivity and intellectual property due to losses from security breaches.  Using existing knowledge of insurance claims and surveys of costs to employees and processes, a more realistic cost of security needs becomes visible.  Perfect accuracy is not necessary; consistency in determining organizational cost is more important.



Communication – Collaboration between technical and non-technical personnel provides the best avenue to protect the organization’s information.  Communicating the potential areas affected by security intrusions identifies the actual negative costs.  Comparing this information to the cost of security measures provides a look at the organization’s return on investment in security.

Doing the Right Thing -


COMPLIANCE <=/=> SECURITY

Compliance with all organizational regulations is not the same as having a secure information system, and having a secure information system is not the same as fulfilling all of the organizational requirements.  The technical factors associated with an organization must be considered, and the cost of security applications must be included in policy and organizational processes.


Monday, February 14, 2011

Beautiful Security Chapter 12, Here Come the Infosecurity Lawyers!

Compliance Issues

I submit that part of the lack of desire to be totally compliant is the attitude about compliance throughout an organization. In a government-contracts-based organization, lack of compliance can mean, at minimum, a fine and potentially the loss of a job.

An example of compliance/non-compliance: there are people who continue to drink and drive after one or more DUIs, and students who refuse to study and instead cheat on exams.  Apparently, the consequences carry no weight in their decisions.  I agree with comments made by my peers in the EM 835 discussion that what is needed is a better understanding of the consequences to the person, the organization, and others affected.  It all begins with personal responsibility.  My DUI example suggests a need for better understanding of the consequences of actions and inaction.

Perhaps my favorite comment, made in one of Stephen Covey’s books about habits and leadership principles: 'we would not need any laws if everyone obeyed the laws!'

Laws are for protecting people.  Punishment is sometimes too little, too late!  From my psychology classes, punishment tends to minimize a behavior, but punishment alone does not necessarily eliminate the bad habit.  Perhaps cognitively accepting responsibility for an action, and the determination to never perform that bad habit again, has a stronger chance at eliminating it.

Unless security is an integral part of a system, minimizing security intrusions, non-compliance, and breaches may be nearly impossible!  Culture has some influence; rewarding and glorifying bad behavior can influence some people in the wrong ways.