Was something missing from the Cisco Annual Cybersecurity Report?

According to Cisco’s 2018 Annual Cybersecurity Report:

  • “Burst attacks” or short DDoS attacks affect 42% of the organizations studied
  • Insider threats are still a huge issue
  • More Operational Technology and IoT attacks are coming
  • Hosting in the cloud offers the side benefit of greater security
  • Nearly half of security risks come from having multivendor environments
  • New domains are often tied to spam campaigns

Many of these findings seem like common sense, or at first glance somewhat in Cisco’s interest, but this 60+ page report goes into much greater detail than these one-liners. It breaks down the analysis by region and over time, and concludes the following about the difficulties of cyber defense:

“One reason defenders struggle to rise above the chaos of war with attackers, and truly see and understand what’s happening in the threat landscape, is the sheer volume of potentially malicious traffic they face. Our research shows that the volume of total events seen by Cisco cloud-based endpoint security products increased fourfold from January 2016 through October 2017.”

The breadth and volume of attacks can overwhelm any organization; it is not a case of ‘if’ but ‘when’.

One thing I didn’t see mentioned at all was cryptojacking, the unapproved leveraging of processing cycles to mine cryptocurrency. This form of cybersecurity risk affects large entities as well as individuals, often through the websites they visit. Generally, it is less destructive than the attack methods described above and may even be seen as an alternative to advertisements on sites, but it seemed odd to me that this rapidly advancing trend wasn’t mentioned.

The report is still worth looking over.


3D printing with Cura on the Raspberry Pi

Since I had a bit of time on my hands this weekend, I switched over the software I use for 3D printing. Since I first got my 3D printer five or six years ago, I’ve been using Repetier under MS-Windows. It is a very flexible solution, but its Raspberry Pi implementation runs only as a server accessed over the web, which is nice, but means you can’t watch the model’s progress while printing. I’ll need to experiment with this more, though.

There is a Cura implementation that runs on top of OctoPi. This print controller allows me to transfer files directly to the printer, initiate printing, and monitor it remotely over the web. Here is the main interface:


One added bonus of making the change to Cura and OctoPi is that I can monitor the printing process remotely using a USB camera (that I had lying around); this capability is simply built in. Here is what that looks like:


The first two prints I tried came off flawlessly, though I do have a small X-axis offset issue in centering the print that I’ve yet to resolve.
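On the camera side, OctoPi’s webcam support is configured in /boot/octopi.txt; here is a hedged sketch of the relevant settings (the resolution and framerate values are illustrative, not necessarily what I used):

    # /boot/octopi.txt -- webcam settings for the built-in mjpg-streamer
    # "auto" tries the Raspberry Pi camera module first, then falls back to USB
    camera="auto"
    # Options passed to mjpg-streamer for a USB camera (example values)
    camera_usb_options="-r 640x480 -f 10"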

If you have a spare Raspberry Pi lying around, it is definitely worth looking into. I also want to try using Slic3r on the Pi as an alternative 3D slicer.

Elastic Map Reduce on AWS

Last week, I put out a post about Redshift on AWS as an effective tool to quickly and dynamically put your toe in a large data warehouse environment.

Another tool from AWS that I experimented with was Amazon’s Elastic Map Reduce (EMR). This is an open source Hadoop installation that supports MapReduce as well as a number of other highly parallel computing approaches. EMR also supports a large number of tools to help with implementation (keeping the environment fresh), such as Pig, Apache Hive, HBase, Spark, and Presto… It also interacts with data from a range of AWS data stores, like Amazon S3 and DynamoDB.

EMR supports a strong security model, enabling encryption at rest as well as in transit, and is available in GovCloud. It handles a range of big data use cases, including log analysis, web indexing, data transformations (ETL), machine learning, financial analysis, scientific simulation, and bioinformatics.

For many organizations, a Hadoop cluster has been a bridge too far for a range of reasons, including support and infrastructure costs and skills. EMR seems to have effectively addressed those concerns, allowing you to set up or tear down a cluster in minutes without having to worry much about the details of node provisioning, cluster setup, Hadoop configuration, or cluster tuning.
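To give a feel for how lightweight this is, here is a rough sketch of launching and terminating a small cluster with the AWS CLI (the cluster name, release label, instance type, and count are illustrative assumptions):

    # Launch a small, illustrative EMR cluster running Hadoop, Hive, and Spark
    aws emr create-cluster \
        --name "poc-cluster" \
        --release-label emr-5.12.0 \
        --applications Name=Hadoop Name=Hive Name=Spark \
        --instance-type m4.large \
        --instance-count 3 \
        --use-default-roles

    # Tear it down when finished (the cluster id is returned by the create call)
    aws emr terminate-clusters --cluster-ids j-XXXXXXXXXXXXX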

For my proof of concept efforts, the Amazon EMR pricing appeared to be simple and predictable, allowing you to pay a per-second rate for the cluster’s installation and use — with a one-minute minimum charge (it used to be an hour!). You can launch a 10-node Hadoop cluster for less than a dollar an hour (naturally, data transport charges are handled separately). There are ways to keep your EMR costs down though.

The EMR approach appears to be focused on flexibility, allowing complete control over your cluster. You have root access to every instance, and you can install additional applications and customize the cluster with bootstrap actions (which can be important, since it takes a few minutes to get a cluster up and running), taking time and personnel out of repetitive tasks.
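Bootstrap actions are just scripts that run on each node before the cluster starts; a hedged sketch of wiring one in (the bucket and script name are assumptions for illustration):

    # Run a custom setup script on every node as the cluster comes up
    aws emr create-cluster \
        --name "poc-cluster-custom" \
        --release-label emr-5.12.0 \
        --applications Name=Hadoop \
        --instance-type m4.large \
        --instance-count 3 \
        --use-default-roles \
        --bootstrap-actions Path="s3://my-poc-bucket/bootstrap.sh"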

There is a wide range of tutorials and training available as well as tools to help estimate billing.

Overall, I’d say that if an organization is interested in experimenting with Hadoop, this is a great way to dive in without getting soaked.

It’s Engineers Week


This week is National Engineers Week.

Founded by NSPE in 1951, Engineers Week (February 18-24, 2018) is dedicated to ensuring a diverse and well-educated future engineering workforce by increasing understanding of and interest in engineering and technology careers. The emphasis is on:

  • Celebrating how engineers make a difference in our world
  • Increasing public dialogue about the need for engineers
  • Bringing engineering to life for kids, educators, and parents

What engineering related activities will be going on near you?

NIST standards draft for IoT Security

The draft version of NIST’s “Interagency Report on Status of International Cybersecurity Standardization for the Internet of Things (IoT)” was released this week and is targeted at helping policymakers, managers and standards organizations develop and standardize IoT components, systems and services.

The abstract of this 187 page document states: “On April 25, 2017, the IICS WG established an Internet of Things (IoT) Task Group to determine the current state of international cybersecurity standards development for IoT. This Report is intended for use by the IICS WG member agencies to assist them in their standards planning and to help to coordinate U.S. government participation in international cybersecurity standardization for IoT. Other organizations may also find this useful in their planning.”

The main portion of the document is in the first 55 pages, with a much larger set of annex sections covering definitions, a maturity model, standards mappings… that will likely be of great interest to those strategizing on IoT.

The document is a great starting point for organizations wanting an independent injection of IoT security perspectives, concerns and approaches. My concern, though, is the static nature of a document like this. Clearly, this Information Technology area is undergoing constant change, and this document will likely seem quaint to some very quickly yet be referenced by others for a long time to come. A wiki version might make this more of a useful, living document.

Comments on the draft are due by April 18. Reviewers are encouraged to use the comment template, and NIST will post comments online as they are received.

AWS Redshift and analytics?

Recently, I had the opportunity to test out Amazon Redshift. This is a fast, flexible, fully managed, petabyte-scale data warehouse solution that makes it simple to cost-effectively analyze data using your existing business intelligence tools. It’s been around for a while and has matured significantly over the years.

In my case, I brought up numerous configurations of multi-node clusters in a few minutes, loaded up a fairly large amount of data, did some analytics and brought the whole environment down – at a cost of less than a dollar for the short time I needed it.
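As a rough sketch of that spin-up/tear-down cycle from the AWS CLI (the cluster identifier, node type, and credentials here are made-up illustrations, not what I actually used):

    # Create a small, illustrative two-node Redshift cluster
    aws redshift create-cluster \
        --cluster-identifier poc-warehouse \
        --node-type dc2.large \
        --number-of-nodes 2 \
        --master-username admin \
        --master-user-password 'ChangeMe123!'

    # Delete it (skipping the final snapshot) once the experiment is done
    aws redshift delete-cluster \
        --cluster-identifier poc-warehouse \
        --skip-final-cluster-snapshot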

There are some great tutorials available, and since Amazon will give you an experimentation account to get your feet wet, you should be able to prove out the capabilities for yourself without it costing you anything.

The security of the data is paramount to the service: it is available in public AWS as well as GovCloud and can be configured to be HIPAA or ITAR compliant… Data can be compressed and encrypted before it ever makes it to AWS S3.

You can use the analytic tools provided by Amazon, or use security groups to access your data warehouse with the same tools you would use on-site. During my testing, I loaded up both a large star schema database as well as some more traditional normalized structures, as sketched below.
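Loading typically goes through Redshift’s COPY command, pulling data straight from S3. Here is a hedged sketch run through psql (the endpoint, table, bucket, and IAM role names are all made up for illustration; psql prompts for the password):

    # Load a hypothetical fact table from S3 via Redshift's COPY command
    psql "host=poc-warehouse.abc123.us-east-1.redshift.amazonaws.com port=5439 dbname=dev user=admin" \
        -c "COPY sales_fact FROM 's3://my-poc-bucket/sales/' IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole' CSV GZIP;"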

Since this is only a blog post, I can’t really go into much detail and the tutorials/videos are sufficient to bootstrap the learning process. The purpose of this post is to inform those who have data warehouse needs but not the available infrastructure that there is an alternative worth investigating.

Got the VPN working in my new house

vpnNow that I’ve moved into my new house, I wanted to get a VPN on the Raspberry Pi working. Having a VPN will allow me to log into my home network securely, no matter where I am. Just thought I’d document the process I used, in case it is useful to anyone else.

The process is straightforward.

  1. First, I wrote down the address of my router and assigned my Raspberry Pi a fixed IP address on my LAN.
  2. Next, I got an account with a Dynamic DNS provider. This is not strictly necessary, but it makes the VPN more useful if your ISP ever changes your IP address. I used to use DuckDNS, but found out that my router had an interface to noIP in its security settings, so I used noIP. I defined my address information on the Dynamic DNS system; now, whenever the router sees that the address has changed, it updates the domain I have defined, and I can just use that address to reach the VPN server.
  3. I installed Raspbian on my Raspberry Pi and then used the pivpn command to set it up:
    curl -L https://install.pivpn.io | bash
  4. This process is well documented on the pivpn site. Once that was completed, I made sure the Pi was up to date with the update commands:
    sudo apt-get update && sudo apt-get upgrade
  5. Next, you need to go into your router and define the dynamic DNS information, as well as the port forwarding to your Raspberry Pi (for OpenVPN this is UDP port 1194 by default).
  6. Now you can use the pivpn -a command on the Raspberry Pi to create a new certificate for each device that will use the VPN (see the sketch after this list).
  7. Then I installed OpenVPN on the devices (Android, Windows, Linux…) and provided them the key file created in step 6.
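To pull the pieces together, here is a hedged sketch of steps 1, 3, 4 and 6 as they looked from the command line (the static IP values are illustrative assumptions for a typical home LAN, not my actual addresses):

    # 1. Give the Pi a static LAN address in /etc/dhcpcd.conf (example values):
    #      interface eth0
    #      static ip_address=192.168.1.50/24
    #      static routers=192.168.1.1
    #      static domain_name_servers=192.168.1.1

    # 3. Install PiVPN (this pipes the installer into bash; inspect it first if you prefer)
    curl -L https://install.pivpn.io | bash

    # 4. Refresh the package lists and upgrade anything stale
    sudo apt-get update && sudo apt-get upgrade

    # 6. Generate an OpenVPN client profile; pivpn prompts for a client name
    #    and drops the resulting .ovpn file in ~/ovpns for copying to the device
    pivpn -a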

That’s a very high-level overview of the process. The VPN allows me to see the screen, access files… just as if the device were on my home network, when I am away.