
Posts Tagged ‘Internet’

Heartbleed – A High-level Look

Saturday, April 12th, 2014

There has been a lot of information flying about on the Internet concerning the Heartbleed vulnerability in the OpenSSL library. Among system administrators and software developers there is a good understanding of exactly what happened, the potential data losses and proper mitigation processes. However, I’ve seen some inaccurate descriptions and discussion in less technical settings.

I thought I would attempt to explain the Heartbleed issue at a high level without focusing on the implementation details. My goal is to help IT and business leaders understand a little bit about how the vulnerability is exploited, why it puts sensitive information at risk and how this relates to their own software development shops.

Heartbleed is a good case study for developers who don’t always worry about data security, feeling that attacks are hard and vulnerabilities are rare. This should serve as a wake-up call that programs need to be tested in two ways – for use cases and misuse cases. We often focus on use cases: “does the program do what we want it to do?” Less frequently do we test for misuse cases: “does the program do things we don’t want it to do?” We need to do more of the latter.
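
To make the distinction concrete, here is a small Python sketch of my own – a toy illustration, not OpenSSL’s actual code – of a “heartbeat” handler that trusts a client-claimed length. The use-case test passes and makes the feature look done; the misuse-case test exposes the flaw. The buffer contents and function names are invented for the example.

# A toy illustration of why trusting a client-supplied length is dangerous,
# and how a "misuse case" test catches what a "use case" test misses.

# Pretend this is server memory: the echo buffer sits next to data
# that was never meant to be sent back.
SERVER_MEMORY = bytearray(b"PING" + b"user=alice;password=hunter2;sessionkey=9f8e7d")

def heartbeat(payload: bytes, claimed_length: int) -> bytes:
    """Echo back `claimed_length` bytes, trusting the client's claim."""
    SERVER_MEMORY[:len(payload)] = payload
    return bytes(SERVER_MEMORY[:claimed_length])   # BUG: no bounds check

def test_use_case():
    # The happy path works, so the feature looks "done".
    assert heartbeat(b"PING", 4) == b"PING"

def test_misuse_case():
    # A hostile client claims a length far larger than the payload it sent.
    leaked = heartbeat(b"PING", 64)
    assert b"password" not in leaked, "adjacent memory leaked!"

if __name__ == "__main__":
    test_use_case()
    test_misuse_case()   # fails, exposing the over-read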

I’ve created a 10-minute video to walk through Heartbleed. It includes the parable of a “trusting change machine.” The parable is meant to explain the Heartbleed mechanics without requiring that the viewer be an expert in programming or data encryption.

If you have thoughts about ways to clarify concepts like Heartbleed to a wider audience, please feel free to comment. Data security requires cooperation throughout an organization. Effective and accurate communication is vital to achieving that cooperation.

Here are the links mentioned in the video:

Fuzzing – A Powerful Technique for Software Security Testing

Friday, January 21st, 2011

I was participating in a code review today and was reminded by a senior architect, who started working as an intern for me years ago, of a testing technique I had used with one of his first programs.  He had been assigned to create a basic web application that collected some data from a user and wrote it to a database.  He came into my office, announced it was done and proudly showed it to me.  I walked over to the keyboard, entered a bunch of junk and got a segmentation fault in response.

Although I didn’t have a name for it, that was a standard technique I used when evaluating applications.  After all, the tried and true paths, expected inputs and easy errors will be tested early and often as the developer exercises the application using the basic use cases.  As Boris Beizer said, “The high-probability paths are always tested if only to demonstrate that the system works properly.” (Beizer, Boris. Software Testing Techniques. Boston, MA: Thomson Computer Press, 1990: 76.)

It is unexpected input that is useful when looking to find untested paths through the code.  If someone shows me an application for evaluation, the last thing I need to worry about is using it in an expected fashion; everyone else will do that.  In fact, I default to entering data outside the specification when looking at a new application.  I don’t know that my team always appreciates the approach.  They’d probably like to see the application work at least once while I’m in the room.

These days there is a formal name for testing of this type, fuzzing.  A few years ago I preferred calling it “gorilla testing” since I liked the mental picture of beating on the application. (Remember the American Tourister luggage ad in the 1970s?)  But alas, it appears that fuzzing has become the accepted term.

Fuzzing involves passing input that breaks the expected input “rules”.  Those rules could come from formal requirements, such as an RFC, or informal requirements, such as the set of parameters accepted by an application.  Fuzzing tools can use formal standards, extracted patterns and even randomly generated inputs to test an application’s resilience against unexpected or illegal input.
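
As a rough illustration of the idea, here is a minimal random-input fuzzer in Python. The parse_record function is a hypothetical stand-in for whatever code is under test; real fuzzing tools are far more sophisticated about generating and mutating inputs, but the principle is the same.

import random
import string

def parse_record(line: str) -> dict:
    """Hypothetical code under test: expects input shaped like 'name=value;age=NN'."""
    name_part, age_part = line.split(";")
    return {
        "name": name_part.split("=")[1],
        "age": int(age_part.split("=")[1]),
    }

def random_input(max_len: int = 40) -> str:
    """Generate a string that almost certainly breaks the expected input 'rules'."""
    alphabet = string.printable
    return "".join(random.choice(alphabet) for _ in range(random.randint(0, max_len)))

def fuzz(runs: int = 1000) -> None:
    for i in range(runs):
        candidate = random_input()
        try:
            parse_record(candidate)
        except (ValueError, IndexError):
            pass   # rejecting bad input cleanly is acceptable behaviour
        except Exception as exc:   # anything else is a finding worth investigating
            print(f"run {i}: input {candidate!r} raised {exc!r}")

if __name__ == "__main__":
    random.seed(2011)   # a fixed seed makes findings reproducible and shareable
    fuzz()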

(more…)

Tag, You’re It!

Wednesday, January 12th, 2011

The Internet is full of examples of simplifications creating vulnerabilities.  A good number of these can be described as indirection enablers: IP addresses, domain names, URIs, tiny URLs, QR Codes and now Microsoft tags.  Each of these serves the purpose of simplifying and decoupling.  We have seen many exploits for the first four; what about the last two?

As you likely know, QR Codes and Microsoft tags are graphical images targeted at print media, though there is no reason they can’t be used in an online fashion.  They are most often presented as rectangular graphics (examples below).  The reason for using them is to provide an easy way for someone to access a web page (or other online resource) related to the printed content.  Since these images represent character data, they can also be used to house information, like contact details, that does not require online access to interpret.

The use case is simple: install a special program that interprets the codes or tags; point the camera of a smart phone or computer at the graphic; and voilà, your phone presents a web page, phone number or other embedded content. Basically this avoids having to manually enter a URL.  Depending on a company’s marketing strategy this is a powerful feature, since a particular ad might want to direct a person to a URL that embeds information about the specific advertisement, media source, publication page and so forth.  Typing in a complicated URL would put off many people, but this removes most of the effort while making the print media interactive.
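
For the curious, here is a sketch of the publisher’s side of that workflow. It uses the third-party Python qrcode package (my choice for illustration, not something tied to either standard), and the URL and campaign parameters are invented.

# Encode a URL that carries the ad, publication and page details so the
# reader never has to type it. Requires the third-party "qrcode" package
# (pip install qrcode[pil]).
from urllib.parse import urlencode

import qrcode

campaign = {
    "ad": "spring-sale-2011",
    "source": "print",
    "publication": "regional-weekly",
    "page": "14",
}
target = "https://www.example.com/landing?" + urlencode(campaign)

img = qrcode.make(target)          # returns an image of the QR code
img.save("spring-sale-qr.png")     # drop this graphic into the print ad
print(f"Encoded {len(target)} characters into the code")

The reader’s phone decodes the same character data and opens the URL, which is exactly the indirection that makes the mechanism both convenient and, as discussed below, a potential risk.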

The main issues with adoption are educating the public about the use of these codes and getting people to install the reader software.  Some of you may recall Radio Shack trying to do something similar several years ago.  They created a scanning device, given out for free, that people had to connect to their PCs.  They could then scan a specific item in a Radio Shack catalog or advertisement and be brought to a web page with detailed information and ordering instructions.

Although that particular attempt failed, these newer approaches have the advantages of being broadly available, leveraging a common accessory on a smart phone (camera) and providing benefits to more than one company.  It will be interesting to see if any of the competing standards catch on with the general public (beyond the two mentioned already there are others such as Data Matrix, Quickmark and PDF417).

My concern, however, isn’t whether these graphical links become popular; it is whether they present another security risk. I believe that they do, in a manner similar to Tiny URLs, yet possibly more insidious.

(more…)

2010 National Cybersecurity Awareness Month

Monday, October 4th, 2010

Welcome, October. There is a chill in the air here in the Northeast and visions of goblins, witches and ghosts are beginning to appear in front yards and on rooftops around the area. Although most of us associate these ideas with the paranormal, those same visions and chill serve to remind us to be ever vigilant when it comes to computer-based threats. So what better time of year to turn our attention to on-line phantoms such as viruses, worms and trojans?

NCSAM Banner

The National Cyber Security Alliance chose October as National Cybersecurity Awareness Month (NCSAM). Their website contains a lot of useful materials for businesses, educators and parents. This is a great resource to use as the basis for informing your company, family and yourself about on-line risks and effective practices for protecting yourself and others from on-line threats.

A core tenet of the alliance’s message is to “Get Involved” and my company, Blue Slate Solutions, is doing just that. Why should we get involved? Like many aspects of life, we are either part of the solution or part of the problem. Users of the Internet who do not understand on-line risks and fail to proactively protect themselves from being victims of cyberattacks become part of the problem.

(more…)

JavaOne 2010 Concludes

Saturday, September 25th, 2010

My last two days at JavaOne 2010 included some interesting sessions as well as spending some time in the pavilion.  I’ll mention a few of the session topics that I found interesting as well as some of the products that I intend to check out.

I attended a session on creating a web architecture focused on high-performance with low-bandwidth.  The speaker was tasked with designing a web-based framework for the government of Ethiopia.  He discussed the challenges that are presented by that country’s infrastructure – consider network speed on the order of 5Kbps between sites.  He also had to work with an IT group that, although educated and intelligent, did not have a lot of depth beyond working with an Oracle database’s features.

His solution allows developers to create fully functional web applications that keep exchanged payloads under 10K.  Although I understand the logic of the approach in this case, I’m not sure the technique would be practical in situations without such severe bandwidth and skill set limitations.

A basic theme during his talk was to keep the data and logic tightly co-located.  In his case it is all located in the database (PL/SQL), but he agreed that it could all be in the application tier (e.g. NoSQL).  I’m not convinced that this is a good approach to creating maintainable high-volume applications.  It could be that the domain of business applications and business verticals in which I often find myself differs from the use cases that are common to developers promoting the removal of tiers from the stack (whether removing the DB server or the mid-tier logic server).

One part of his approach with which I absolutely concur is to push processing onto the client. Using the client’s CPU seems like common sense to me.  The work is in balancing that with security and bandwidth.  However, it can be done, and I believe we will continue to find more effective ways to leverage all that computing power.

I also enjoyed a presentation on moving data between a data center and the cloud to perform heavy and intermittent processing.  The presenters did a great job of describing their trials and successes with leveraging the cloud to perform computationally expensive processing on transient data (e.g. they copy the data up each time they run the process rather than pay to store their data).  They also provided a lot of interesting information regarding options, advantages and challenges when leveraging the cloud (Amazon EC2 in this case).

(more…)

Man in the Middle? No, Just Your Carrier

Saturday, January 23rd, 2010

As you may be aware, several individuals using AT&T-based cellular phones recently reported being logged into the wrong Facebook account when accessing the Facebook site from their phones.  Current reports indicate that the root cause is AT&T’s network, which misdirected Facebook cookies.  These cookies, set to reflect that an individual has logged in, are to be stored on each user’s device.  Is this issue a cause for concern?  Is the issue likely limited to Facebook?  Does Facebook bear any responsibility?

In terms of concern, I’d say there is cause for major concern. We implicitly trust that the single request/response interaction between the browser and the server is carried over a single network connection.  Unless an attacker inserts himself or herself into the virtual connection circuit, the server’s response to the browser (containing the cookie) must travel back over the same connection that sent the original credentials.

In this case that trust appears to be misplaced.  It is easy to understand how this is possible.  The carriers are free to manage connections however they choose.  In reality the carrier is likely proxying between the cellular network and the Internet, like any NAT-based approach.  A little coding error, such as an improperly shared resource, and results destined for one phone are returned to another.

Classically this seems like a race condition.  Certainly in the latest incident that explanation seems consistent with the facts since the two people who ended up on each other’s Facebook accounts were on-line at the same time.  Nothing particularly interesting about multi-threaded code containing a race condition.  It has happened before and will happen again.
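To illustrate the kind of bug being described – this is my own toy example, not AT&T’s actual infrastructure – consider a multi-threaded proxy that stores the upstream response in a single shared slot:

import random
import threading
import time

class NaiveProxy:
    """Toy proxy with an improperly shared resource: one response slot for every thread."""
    def __init__(self):
        self.last_response = None   # BUG: shared across concurrent requests

    def fetch(self, user: str) -> str:
        # Simulate the upstream site setting a login cookie for this user.
        self.last_response = f"Set-Cookie: session_for={user}"
        time.sleep(random.uniform(0, 0.01))   # network jitter widens the race window
        return self.last_response             # may now hold another user's cookie

def simulate():
    proxy = NaiveProxy()
    crossed = []

    def request(user: str):
        response = proxy.fetch(user)
        if user not in response:
            crossed.append((user, response))

    threads = [threading.Thread(target=request, args=(u,)) for u in ("alice", "bob") * 50]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(f"{len(crossed)} of {len(threads)} requests got someone else's cookie")

if __name__ == "__main__":
    simulate()

Run it a few times and some requests come back holding the other user’s cookie, which is essentially the behavior that was reported.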

This leads me to my second question: is this likely limited to Facebook users on the AT&T network? That seems doubtful.  It is hard to imagine that the carrier’s infrastructure that proxies requests includes specialized instructions just for Facebook.  It seems very likely that any connection-related flaw can occur for any web interaction.

(more…)

Project H.M.

Sunday, December 6th, 2009

I have been following the work being done by The Brain Observatory at UCSD to carefully section the brain of patient H.M. The patient, whose identity was protected while he was living, is known as the most studied amnesiac.  His amnesia was caused by brain surgery he underwent when he was 27 years old.

Screenshot from the live broadcast of Project H.M.'s brain slicing process

I won’t redocument his history; it is widely available on various websites, a few of which I’ll list at the end of this posting.  For me, this study is fascinating in terms of the completely open way the work is being done.  The process of sectioning the brain was broadcast in real time on UCSD’s website.  The entire process being followed is discussed in an open forum.  The data being collected will be freely available.  To me this shows the positive way that the web can be leveraged.

I spend so much time in the world of commercial and proprietary software solutions that I sometimes end up with a distorted view of how the web is used.  Most of my interactions on the web are in the creation of applications that are owned and controlled by companies whose content is only available to individuals with some sort of financial relationship with the web site owner.

Clearly sites like Wikipedia make meaningful content available at no cost to the user.  However, in the case of this work at UCSD, there is an enormous expense in terms of equipment and people in order to collect, store, refine and publish this data.  This is truly a gift being offered to those with an interest in this field.  I’m sure that other examples exist and perhaps a valuable service would be one that helps to organize such informational sites.

If you are interested in more information about H.M. and the project at UCSD, here are some relevant websites:

Net Neutrality: Is There a Reason for Concern?

Monday, October 12th, 2009

Lately the subject of net neutrality has garnered a lot of attention.  As businesses large and small create an ever-increasing set of offerings that require lots of bandwidth, there is concern that the Internet infrastructure may not be able to keep data flowing smoothly.

The core of the Internet’s bandwidth is provided by a few businesses.  These businesses exist to make money.  Fundamentally, when demand exceeds supply the cost of the good or service goes up.  In this case those costs might appear as increased charges for access or a slowing of one company’s data transfer versus another.

As in many debates, there are two extreme positions represented by individuals, companies and trade groups.  In this case the dimension being debated is whether there is a need to legislate a message-neutral Internet (Net Neutrality).

The meaning of being “neutral” is that all data flowing across the Internet would be given equal priority.  The data being accessed by a doctor reading a CAT scan from a health records system would receive the same priority as someone watching a YouTube video.

Although the debate surrounds whether net neutrality should be a requirement, the reasons for taking a position vary.  I’ll start with concerns being shared by those that want a neutral net to be guaranteed.

Why Net Neutrality is Important

The Internet has served as a large and level playing field.  With a very small investment, companies can create a web presence that allows them to compete as peers of companies many times their size.

Unlike the brick-and-mortar world where the location, size, inventory, staff and ambiance of a store have direct monetary requirements, a web site is limited by the creativity and effort of a small team with an idea.  If Amazon and Google had needed to create an infrastructure on par with Waldenbooks or IBM in order to get started they would have had a much tougher journey.

Data on the Internet should be equally accessible.  It should not matter who my Internet Service Provider (ISP) is.  Nor should my ISP’s commercial relationships have any bearing on the service it provides me when I access the services of my choice.

If I choose to use services from my ISP’s competitor, I should have access equivalent to using my ISP’s offering.  For instance, if I choose to use Google’s portal versus my ISP’s portal, the data sent by Google must not be impeded in favor of customers requesting my ISP’s content.

Network discrimination would dismantle the fundamental design of the Internet.  One of the original design goals for the Internet was its ability to get data from one place to another without regard for the actual content.  In other words, the underlying transport protocol knows nothing about web pages, SSH sessions, videos, and Flash applications.  It is this service agnosticism that has allowed myriad services to be created without having to fundamentally reengineer the Internet backbone.

An infrastructure that used to routinely carry only telnet and UUCP sessions now passes all manner of protocols and data.  Adding a layer of discrimination would require altering this higher-level agnosticism, since it is typically through inspection of the payload that one would arrive at decisions about varying the level of service.  This would lead the Internet down a road away from its open, standards-based design.
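
As a simple way to picture that agnosticism, here is a minimal TCP relay sketch in Python; it forwards bytes without ever parsing them, so it treats a web page, an SSH session and a video stream identically. The addresses are hypothetical, and the commented line marks where a discriminating network would have to add payload inspection.

import socket
import threading

def relay(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes in one direction. Note what is absent: the relay never
    parses the payload, so it works identically for any protocol."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        # A "discriminating" relay would have to inspect `data` here to
        # guess the service before deciding how quickly to forward it.
        dst.sendall(data)
    try:
        dst.shutdown(socket.SHUT_WR)   # signal end-of-stream downstream
    except OSError:
        pass

def forward(listen_port: int, target_host: str, target_port: int) -> None:
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", listen_port))
    server.listen()
    while True:
        client, _ = server.accept()
        upstream = socket.create_connection((target_host, target_port))
        threading.Thread(target=relay, args=(client, upstream), daemon=True).start()
        threading.Thread(target=relay, args=(upstream, client), daemon=True).start()

if __name__ == "__main__":
    forward(8080, "example.com", 80)   # hypothetical addresses for illustration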

There are other specific arguments made in favor of ensuring net neutrality.  In my opinion most of them relate to one of these I’ve mentioned.  So why oppose the requirement of net neutrality? (more…)