Oyente - code analysis for Ethereum contracts

Software may be secure by design, but not in its implementation. Every software release process must include a battery of automated security tests. Oyente is a cool new static code analysis tool for Ethereum smart contracts. It detects common bugs related to transaction ordering, timestamps, and reentrancy. The original conference paper, Making Smart Contracts Smarter, is well worth reading and has the clearest explanation I've seen of smart contract errors, why they happen, and how to avoid them.
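
If you want to try Oyente on a contract of your own, the basic invocation looks something like this - a sketch assuming you have cloned the oyente repository, installed its solc and evm dependencies, and run it from the directory containing oyente.py (MyContract.sol is a hypothetical file name):

python oyente.py -s MyContract.sol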


Get (re)Started with building on Ethereum

Chris Whinfrey wrote a great tutorial on getting started with Ethereum.  It took me less than an hour to download all the tools and run the test project that comes with Truffle. I created my own private blockchain running via testrpc, 10 wallets filled with 100 ether, and even 10,000 shiny MetaCoins! I could view them all via my MetaMask Chrome plug-in.
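
If you want to reproduce that setup, the install boils down to something like this (assuming Node.js and npm are already installed; at the time, testrpc was published as the ethereumjs-testrpc package):

npm install -g truffle ethereumjs-testrpc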

I turned off my computer, and went to bed.

Two days later, everything was gone. My ether evaporated and my dApp couldn't even check my MetaCoin balance.

This is a tutorial on how to get re-started with building on Ethereum.

First, you need to restart your private blockchain and seed it with the same accounts you had before (and have conveniently saved into MetaMask). The trick is to use a mnemonic to start testrpc.  The mnemonic lets you restart the blockchain with the same accounts, preloaded with 100 ether. Hopefully, you saved the output of your first testrpc run - it will include the HD Wallet mnemonic. 

testrpc --mnemonic="my special words that I saved"

What?! You didn't save the mnemonic from the first time you ran testrpc? That's ok. MetaMask still has the private keys for all your accounts. You can manually specify each account's private key (as a 0x-prefixed hex string) and balance. Note that testrpc takes the balance in wei, so 100 ether is 100000000000000000000 (1e20).

testrpc --account="0x7ba30c70faa34ca7032d7d0e17112345d7c2950f8063196d0488818881cab1ce,100000000000000000000"

Now you need to push your MetaCoin contract to your private blockchain. Remember - testrpc starts with a clean blockchain. Go to your MetaCoin directory and type:

truffle migrate

Finally, you can run your dApp:

npm run dev

Open up Chrome and go to http://localhost:8080. MetaMask should show that all your accounts have a balance of 100 ether, and you should have 10,000 MetaCoins again.
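
If MetaMask ever looks out of sync, you can also query testrpc directly over JSON-RPC - a sketch assuming testrpc is listening on its default port 8545; substitute one of your own account addresses:

curl -X POST -H "Content-Type: application/json" --data '{"jsonrpc":"2.0","method":"eth_getBalance","params":["0xYourAccountAddress","latest"],"id":1}' http://localhost:8545

The balance comes back as a hex-encoded number of wei.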

Siri on Security

Security Researchers are constantly finding new ways to hack locked iPhones using Siri.

I should know. For the past year, I have been living with an incredibly talented and motivated Security Researcher: my one-year-old son. I naively thought that handing him a locked phone would keep him entertained. Wheee! The buttons change colors when you touch them! Instead, my In-House Security Researcher was able to:

  • Take photos of the rug
  • Access the camera roll - and delete really cute photos!
  • Send emails
  • Place 911 calls
  • Place Skype calls to people on my Skype Contact list
  • Play music
  • Make Siri execute random tasks triggered by anything that sounds like "Mama"

Here is how to disable all of this on the lock screen:

  1. Disable Siri. Go to Settings->Siri and turn it off.
  2. Notifications. Go to Settings->Notifications and tap each program individually. Think carefully about what you want to allow on the Lock Screen. Some of these may accidentally allow access to the actual program while the phone is locked.
  3. Update the Lock Screen. Go to Settings->Touch ID & Passcode. In the section Allow Access When Locked, turn everything off. This includes Today View, Notifications View, Reply with Message, Home Control, and Wallet. You want the lock screen to expose as little as possible.

Getting the Camera (and your photos!) off the Lock Screen is not very convenient. You need to go to Settings->General->Restrictions and then enable Restrictions. At this point, you can select apps to remove entirely.  They will be gone from the Lock Screen - and your phone.  You would need to go back to Settings->General->Restrictions and enable the camera to get it back.  For most people, this is not worth the hassle.

Note: In iOS 10, Apple disabled most Siri settings, so hopefully we will have fewer "Mama" commands in the future.


Build OpenSSH 7.4p1 with Cygwin on Windows

There is a lot of conflicting information out there. But once you have the correct instructions, it is actually quite easy.

1. Install Cygwin. You will need to install the following packages:

  • zlib
  • crypt
  • openssl-devel
  • libedit-devel
  • libkrb5-devel
  • autoconf-2.69

You can update Cygwin to add more packages. Run Cygwin-Setup (the installer) again. It will prompt you to select more packages. Make sure that the packages above are marked "Keep" and not "Skip."

2. Download Portable OpenSSH. You must download openssh-7.4p1.tar.gz from one of the mirrors. DO NOT GET THE SOURCE VIA GIT. It is missing the configure script and the instructions for generating it via the autoreconf command do not work. Do this one simple thing and you will save yourself a lot of pain. Trust me.

3. Unpack OpenSSH. Open Cygwin and cd into the directory containing openssh-7.4p1.tar.gz

$ gzip -d openssh-7.4p1.tar.gz
$ tar -xvf openssh-7.4p1.tar

This will create a new directory openssh-7.4p1 which contains all the source files.

4. Build OpenSSH. Open Cygwin and go into the directory containing the OpenSSH source files. This is the directory you created in step 3.

$ ./configure --prefix=DIR
$ make
$ make install

This will build and install OpenSSH into the directory DIR. If, for some crazy reason, you want to install OpenSSH on a drive other than C:, you can use a prefix such as /cygdrive/f/openssh.

5. OpenSSH uses typical Makefile targets. A couple you should know about:

$ make tests
$ make clean 
$ make uninstallall 

I found the directions in the file INSTALL very useful. The instructions in /contrib/Cygwin/README are deprecated. 
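
One last sanity check: ask the freshly built binary for its version (assuming you installed into DIR as above):

$ DIR/bin/ssh -V

It should report OpenSSH_7.4p1 along with the OpenSSL version it was built against.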

Nurse wants to email

I had an interesting conversation today with a nurse who teaches nursing students. When her students send her clinical notes, they are very careful to not mention any identifying information about the patient. This is a very important part of nursing culture that is instilled early into all students.

But what if two nurses (or doctors) need to email each other about a patient? Surely there must be a way. And there is: Some email providers are HIPAA compliant. 

GSuite. This is Google's paid email service. It also comes with a HIPAA compliant calendar and file sharing system.

Office 365. This is Microsoft's online suite of email, calendar, file sharing, and communication. It also gives you a license to use Microsoft desktop Office products such as Word.  The Microsoft BAA applies to all commercial and educational versions.

There are also many lesser-known companies that specifically offer HIPAA compliant email. These are generally based on Microsoft's Exchange server, with the provider taking care of maintaining security.

Crossing the border

A NASA scientist was recently forced to give up the PIN to his work phone.

Your privacy rights are significantly reduced when crossing national borders. This can become an issue if you want to bring a phone or laptop that you use for business.  You might have legal or contractual requirements to keep the information on your devices confidential.

Doctors could have patient phone numbers stored on the phone, either explicitly in the address book or implicitly as a list of recent phone calls. Even if you can't see them, they are likely still there.

Lawyers could have attorney-client privileged documents, emails, etc.

Engineers could have intellectual property.

US Customs or ICE can force you to surrender your device(s) for up to 30 days for forensic analysis. They do not need probable cause or even suspicion. What they may NOT do is force you to reveal any PINs or passwords without a court order. Unfortunately, if you choose to exercise your right to remain silent, you may be detained long enough to miss your flight, barred from entering the country, or otherwise severely inconvenienced. The rules in other nations vary, and in some countries you can be forced to surrender both your mobile device and your passwords.

The best way to protect sensitive data is not to physically bring it with you across national borders. That way you cannot be forced to give up your passcodes or decryption keys. 


Do your workers use cell-phones?

There are a couple of steps you need to take to ensure the information on your workers' mobile devices is secure. This is especially important if you need to be HIPAA compliant.

  • Password protect your device. This is the first layer of protection for any mobile device. Make sure that all apps (camera, Siri, etc.) are disabled whenever you are not logged into your device.
  • Disable Text/Email notifications. You don't want the text of emails or SMS messages to pop up on your locked screen.
  • Turn on full disk encryption. OCR will levy fines based on the assumption that all PHI on your phone is compromised unless the phone is encrypted.
  • Turn off cloud back-up. You cannot use any cloud service to store PHI unless the provider executes a BAA with you.
  • Disable automatically connecting to WiFi hotspots. Setting up a public WiFi network near malls, airports, and hotels is a common technique for gathering information. Train your workforce to only use locked WiFi networks they know.
  • Enable Tracking. This lets you recover the phone if it is merely lost and avoid having to report an incident to OCR.
  • Enable Remote Wipe. This will erase all data from your phone if it is lost or stolen.
  • Configure Secure Messaging. The default programs that come with the phone - Apple Mail, SMS text, iMessage, etc. - are not HIPAA compliant. Your business needs a suite of secure email and messaging apps.

Medical businesses need to implement operational procedures that ensure all their workers' devices are HIPAA compliant.

Encrypt your computer

The Office for Civil Rights at the U.S. Department of Health and Human Services demands hefty penalties from companies whenever a stolen, unencrypted laptop compromises PHI. The average fine is over $880,000 per laptop! You can avoid this penalty by ensuring all employees and contractors activate full disk encryption on their computers.

Mac OS X. Turn on FileVault. FileVault 2 has been available since OS X 10.7 Lion, and beginning with 10.10 Yosemite the setup assistant offers to turn it on by default.
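
To verify from Terminal that FileVault is actually on (fdesetup ships with OS X 10.8 and later):

fdesetup status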

Windows. Turn on BitLocker. It is available on Windows 10 Pro, Education, and Enterprise; on Windows 8 and 8.1 Pro and Enterprise; on Windows 7 Ultimate and Enterprise; and on Windows Vista Ultimate and Enterprise.
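
To confirm BitLocker is active, check from an elevated Command Prompt:

manage-bde -status C: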

Linux. You can activate full disk encryption on Ubuntu versions 12.10 and later during installation. 
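
To confirm after the fact that the disk is encrypted - a quick check, assuming the standard LUKS setup the installer creates - look for a device of type crypt:

lsblk -o NAME,TYPE,MOUNTPOINT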

ChromeBook. Encryption is enabled by default, unless you are in Developer Mode. 

iPhone, iPad, or iPod. Turning on the Passcode automatically activates full disk encryption. Make sure you use a six-digit passcode or longer. Available on the iPhone 3GS or later, all iPads, and the iPod touch 3rd generation or later.

Android. Full disk encryption has been available since Android 3.0 and is enabled by default on many new devices starting with 5.X. Android disk encryption has some known vulnerabilities that iOS does not have.

Windows Mobile. You can easily encrypt Windows 10 Mobile phones. Windows Phone 8.1 supports disk encryption, but only when the phone is connected to a device management server.

First they came for SHA-1

Cryptographic hash functions are a basic building block of cryptography.  They have three very important properties:

  1. Pre-image resistance. Given a value h, it is hard to find a message m such that hash(m)=h.
  2. Second pre-image resistance. Given a message m1, it is hard to find a second message m2 such that hash(m1)=hash(m2).
  3. Collision resistance. It is hard to find a pair of messages (m1, m2) such that hash(m1)=hash(m2).

Cryptographic hash functions have a lifetime of about 10-20 years before they are broken. Of the hash algorithms you typically see in common APIs, MD4, MD5, and now even SHA-1 are broken. The SHA-2 and SHA-3 families are still considered secure... for now.
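
You can compute these digests yourself with the openssl command-line tool (any test string will do):

echo -n "attack at dawn" | openssl dgst -sha1
echo -n "attack at dawn" | openssl dgst -sha256

Finding two different inputs that make either command print the same digest should be computationally infeasible; for SHA-1, researchers have now published real collisions.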

Many security protocols such as TLS and SSH include an algorithm negotiation phase where the client and server mutually agree which cryptographic functions to use.  Crypto agility is vital for long-term security. 
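
OpenSSH, for example, will tell you exactly which algorithms your client can negotiate - a quick way to audit your own crypto agility:

ssh -Q kex
ssh -Q cipher
ssh -Q mac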

PHI - Protected Health Information

If you want to secure your PHI, you first have to find it.

Protected health information (PHI) under US law is any information about health status, provision of health care, or payment for health care that is created or collected by a "Covered Entity" (or a Business Associate of a Covered Entity), and can be linked to a specific individual. Source: https://en.wikipedia.org/wiki/Protected_health_information

Some places you can look:

Calendar. This is how two Phoenix cardiologists earned a $100,000 fine. 

Email.  Do your doctors email each other about patients?  Do they get emails from patients? Now you need to protect the emails stored on your servers and mirrored to the Inbox on every device you own.

Voicemail. Do your patients leave you messages? You need to make sure your voicemail box is secure. Similarly, be wary of cloud-hosted phone services.

Text Messages. If your patients send you text messages (which I hope they don't, because it is an open channel!), the text history stored on your phone is now PHI.

Billing. You get paid, right?  Well, the payment records link patients to you. This means your billing information is not just PCI, it is also PHI!

The Cloud. Somewhere out there, over the rainbow, you have a server with patient data.  You need to protect it.  If any of that data is ever unencrypted, or if the decryption keys are stored by the same cloud service as the data, you need to sign a BAA with the cloud service provider.

The Guy With Three Computer Screens on His Desk. Otherwise known as the developer, QA tester, database manager, or system administrator. He has access to the PHI (which he innocuously calls "production data") because he codes/tests/administers the system where it is stored. But testing might require obtaining a local copy...and you see where this is going. If you outsource any of your dev/QA/sysadmin functions to outside contractors, you need a rock-solid BAA.

The Ridiculously Expensive Printer. The bigger and more expensive, the more likely it stored electronic copies of all PHI you sent it to print. Affinity Health paid a $1.2 million penalty for data discovered on the hard drive of its copy machines.

Laptops, cell-phones, and computers. If your company handles PHI, just assume that every device that ever connects to your system MAY have PHI on it (even if it's not supposed to). Concentra Health Services paid a $2 million penalty due to a stolen laptop.

Browser Cache. You thought you placed your PHI safely in the cloud. But there it is, cached on every laptop, cell-phone, and workstation that ever connects to your web portal (see above).

USB drives. Those little devices you use to transfer data between computers when you don't have a secure file transfer service available.  

The Recycling Bin. Because that's where you put your PHI after you shredded it.

All of the above examples constitute "data-at-rest." Your job is to encrypt it, create a back-up, and encrypt the back-up. Then add some audit and access control. If a separate organization has access to the data, you need to sign a BAA to ensure that they do the same.
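
For a single server, the "encrypt the back-up" step can be as simple as this sketch (gpg and tar assumed installed; the directory name is hypothetical, and the passphrase belongs in your normal key-management process, not on a sticky note):

tar -czf phi-backup.tar.gz /path/to/phi-data
gpg --symmetric --cipher-algo AES256 phi-backup.tar.gz

That produces phi-backup.tar.gz.gpg; store it somewhere covered by a BAA and securely delete the plaintext archive.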

HIPAA Compliance

Check out this excellent checklist from HIPAA Journal on what it takes to achieve HIPAA compliance: The HIPAA Check-List.

Security is more than just encryption. It is an entire system of policies, procedures, training, equipment, legal agreements, and contingency planning derived from a formal risk analysis for your business.

Security is NOT your next Killer Feature

People often treat security as an extra feature. Except it doesn't do anything. It's not pretty. It doesn't generate more sales. It may even inconvenience your users. Security is more like an anti-feature that eats up your development time. It's... it's... plumbing. It has to be there. It has to be done right and up to code. And, if it fails... le déluge.

Security as Plumbing

When you want to buy a house, you don't peruse blueprints of pipes and air ducts. Yet their location will determine where you can build the extra bathroom or a spare kitchen for the mother-in-law apartment. A good security design will make it easy to add new features to your product. A bad one will require a full redesign and security inspection when all you want to do is add a new button or extra servers for load balancing. And a really bad design will result in a flood of unexpected costs as you deal with a security breach.

If you want to deal as little with security as possible in your development lifecycle, then you need to design the security architecture right from the beginning.