3D printing with Cura on the Raspberry Pi

Since I had a bit of time on my hands, I spent part of this weekend switching over the software I use for 3D printing. Ever since I first got my 3D printer 5 or 6 years ago, I'd been using Repetier under MS-Windows. It is a very flexible solution, but its Raspberry Pi implementation is only available as a server that you access over the web, which is nice, but you can't watch the model's progress while printing. I'll need to experiment with this more, though.

There is a Cura implementation that runs on top of OctoPi. This print controller allows me to transfer files directly to the printer, initiate printing, and monitor it remotely over the web. Here is the main interface:

[Image: main interface]

One added bonus of making the change to Cura and OctoPi is that I can monitor the printing process remotely using a USB camera (that I had lying around); this capability is just built in. Here is what that looks like:

[Image: camera view]

The first two prints I tried came off flawlessly, though I do have a small X-axis offset issue, needed to center the print, that I've yet to resolve.

If you have a spare Raspberry Pi lying around it is definitely worth looking into. I also want to try using Slic3r on the Pi as an alternative 3D slicer.


Elastic Map Reduce on AWS

[Image: derived data]

Last week, I put out a post about Redshift on AWS as an effective tool to quickly and dynamically put your toe in a large data warehouse environment.

Another tool from AWS that I experimented with was Amazon's Elastic MapReduce (EMR). This is a managed platform for running open-source Hadoop, supporting MapReduce as well as a number of other highly parallel computing approaches. EMR also supports a large number of tools to help with implementation (and keeps those environments fresh), such as Pig, Apache Hive, HBase, Spark, and Presto. It also interacts with data from a range of AWS data stores, like Amazon S3 and DynamoDB.
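To give a feel for the MapReduce model that EMR runs at scale, here is a minimal single-machine sketch of its three phases applied to a word count. The function names and sample lines are illustrative only, not EMR or Hadoop APIs:

```python
# Sketch of the MapReduce model: map emits (key, value) pairs, the shuffle
# groups values by key, and reduce aggregates each group. On EMR these
# phases run distributed across the cluster's nodes.
from collections import defaultdict

def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield word.lower(), 1          # emit (word, 1) for each word

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)          # group all values by key
    return groups

def reduce_phase(groups):
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["the quick brown fox", "the lazy dog", "The fox"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["the"], counts["fox"])  # 3 2
```

The same map/shuffle/reduce shape underlies a Hive query or a Spark job; EMR just provisions and wires up the machines that run it.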

EMR supports a strong security model, enabling encryption at rest as well as in transit, and is available in GovCloud. It handles a range of big data use cases, including log analysis, web indexing, data transformations (ETL), machine learning, financial analysis, scientific simulation, and bioinformatics.

For many organizations, a Hadoop cluster has been a bridge too far for a range of reasons, including support and infrastructure costs and skills. EMR seems to have addressed those concerns effectively, allowing you to set up or tear down a cluster in minutes, without having to worry much about the details of node provisioning, cluster setup, Hadoop configuration, or cluster tuning.

For my proof-of-concept efforts, the Amazon EMR pricing appeared to be simple and predictable: you pay a per-second rate for the cluster's installation and use, with a one-minute minimum charge (it used to be an hour!). You can launch a 10-node Hadoop cluster for less than a dollar an hour (naturally, data transfer charges are billed separately). There are ways to keep your EMR costs down, though.
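The billing rule above is simple enough to sketch. The per-node hourly rate in this example is hypothetical; check the actual EMR rate for your instance type and region:

```python
# Rough model of EMR billing as described above: per-second pricing with a
# one-minute minimum. The $0.096/node-hour figure is made up for illustration.
def emr_cost(nodes, seconds, hourly_rate_per_node):
    billable_seconds = max(seconds, 60)   # one-minute minimum charge
    return nodes * billable_seconds * hourly_rate_per_node / 3600.0

# 10 nodes for one hour at the hypothetical rate: just under a dollar.
print(round(emr_cost(10, 3600, 0.096), 2))  # 0.96

# A 30-second experiment is billed the same as a full minute.
print(emr_cost(1, 30, 0.096) == emr_cost(1, 60, 0.096))  # True
```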

The EMR approach appears to be focused on flexibility, allowing complete control over your cluster. You have root access to every instance and can install additional applications and customize the cluster with bootstrap actions (which can be important, since it takes a few minutes to get a cluster up and running), taking time and personnel out of repetitive tasks.

There is a wide range of tutorials and training available as well as tools to help estimate billing.

Overall, I’d say that if an organization is interested in experimenting with Hadoop, this is a great way to dive in without getting soaked.

It’s Engineers Week

[Image: 2018 Engineers Week logo]

This week is National Engineers Week.

Founded by NSPE in 1951, Engineers Week (February 18-24, 2018) is dedicated to ensuring a diverse and well-educated future engineering workforce by increasing understanding of and interest in engineering and technology careers. The emphasis is on:

  • Celebrating how engineers make a difference in our world
  • Increasing public dialogue about the need for engineers
  • Bringing engineering to life for kids, educators, and parents

What engineering related activities will be going on near you?

NIST standards draft for IoT Security

[Image: IoT]

The draft version of NIST's "Interagency Report on Status of International Cybersecurity Standardization for the Internet of Things (IoT)" was released this week and is targeted at helping policymakers, managers and standards organizations develop and standardize IoT components, systems and services.

The abstract of this 187-page document states: "On April 25, 2017, the IICS WG established an Internet of Things (IoT) Task Group to determine the current state of international cybersecurity standards development for IoT. This Report is intended for use by the IICS WG member agencies to assist them in their standards planning and to help to coordinate U.S. government participation in international cybersecurity standardization for IoT. Other organizations may also find this useful in their planning."

The main portion of the document is in the first 55 pages, with a much larger set of annex sections covering definitions, a maturity model, standards mappings… that will likely be of great interest to those strategizing on IoT.

The document is a great starting point for organizations wanting an independent injection of IoT security perspectives, concerns and approaches. My concern, though, is the static nature of a document like this. Clearly, this information technology area is undergoing constant change, and this document will likely seem quaint to some very quickly while being referenced by others long into the future. A wiki version might make this a more useful, living document.

Comments on the draft are due by April 18. Reviewers are encouraged to use the comment template, and NIST will post comments online as they are received.

AWS Redshift and analytics?

[Image: data insight]

Recently, I had the opportunity to test out Amazon Redshift. This is a fast, flexible, fully managed, petabyte-scale data warehouse solution that makes it simple and cost-effective to analyze data using your existing business intelligence tools. It's been around for a while and has matured significantly over the years.

In my case, I brought up numerous configurations of multi-node clusters in a few minutes, loaded up a fairly large amount of data, did some analytics and brought the whole environment down – at a cost of less than a dollar for the short time I needed it.

There are some great tutorials available, and since Amazon will give you an experimentation account to get your feet wet, you should be able to prove out the capabilities for yourself without it costing you anything.

The security of the data is paramount to the service: it is available in public AWS as well as GovCloud and can be configured to be HIPAA- or ITAR-compliant… Data can be compressed and encrypted before it ever makes it to AWS S3.

You can use the analytic tools provided by Amazon, or use security groups to access your data warehouse with the same tools you would use on-site. During my testing, I loaded up both a large star schema database and some more traditional normalized structures.
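To illustrate the star schema pattern mentioned above, here is a tiny sketch using SQLite from the Python standard library. The table names and data are invented for illustration; the same kind of fact/dimension join is what Redshift parallelizes across its cluster nodes:

```python
# A minimal star schema: one fact table surrounded by a dimension table,
# queried with the usual analytic pattern of join + group-by + aggregate.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, quantity INTEGER)")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 5), (1, 3), (2, 7)])

# Aggregate the fact table, grouped by a dimension attribute.
cur.execute("""
    SELECT p.name, SUM(s.quantity)
    FROM fact_sales s JOIN dim_product p ON s.product_id = p.product_id
    GROUP BY p.name ORDER BY p.name
""")
rows = cur.fetchall()
print(rows)  # [('gadget', 7), ('widget', 8)]
```

On Redshift the SQL is essentially the same (it is PostgreSQL-compatible); the difference is that the fact table can hold billions of rows distributed across nodes.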

Since this is only a blog post, I can’t really go into much detail and the tutorials/videos are sufficient to bootstrap the learning process. The purpose of this post is to inform those who have data warehouse needs but not the available infrastructure that there is an alternative worth investigating.

Got the VPN working in my new house

[Image: VPN]

Now that I've moved into my new house, I wanted to get a VPN on the Raspberry Pi working. Having a VPN will allow me to log into my home network securely, no matter where I am. Just thought I'd document the process I used, in case it is useful to anyone else.

The process is straightforward.

  1. First, I wrote down the address of my router and assigned my Raspberry Pi a fixed IP address on my LAN.
  2. Next, I got an account with a Dynamic DNS provider. This is not strictly necessary, but it makes the VPN more useful if your ISP ever changes your IP address. I used to use DuckDNS, but found out that my router had an interface in its security settings to noIP, so I used noIP. I defined my address information on the Dynamic DNS system. Now, whenever the router sees that the address has changed, it should update the domain I have defined, and I can just use that address to reach the VPN server.
  3. I installed Raspbian on my Raspberry Pi and then used the pivpn installer to set it up:
    curl -L https://install.pivpn.io | bash
  4. This process is well documented on the pivpn site. Once that was completed, I made sure the Pi was up to date with the update and upgrade commands:
    sudo apt-get update
    sudo apt-get upgrade
  5. Next, you need to go into your router, configure the dynamic DNS information, and define the port forwarding to your Raspberry Pi.
  6. Now you can use the pivpn -a command on the Raspberry Pi to create a new certificate for each device that will use the VPN.
  7. Then I installed OpenVPN clients on the devices (Android, Windows, Linux…) and provided them with the certificate file created in step 6.

That's a very high-level overview of the process. When I am away, the VPN allows me to see the screen, access files, and so on, just as if the device were on my home network.

Six thoughts on mobility trends for 2018

[Image: mobility walk]

Let's face it, some aspects of mobility are getting long in the tooth. The demand for more capabilities is insatiable. Here are a few areas where I think 2018 will see some exciting capabilities develop. Many of these are not new, but their interactions and intersection should provide some interesting results and thoughts to include during your planning.

1. Further blurring and integration of IoT and mobile

We're likely to see more situations where mobile devices recognize the IoT devices around them to enhance contextual understanding for the user. We've seen some use of NFC and Bluetooth to share information, but approaches that embrace the environment and act upon the information available are still in their infancy. This year should provide some significant use cases and maturity.

2. Cloud Integration

By now, most businesses have done much more than just stick a toe in the cloud Everything-as-a-Service (XaaS) pool. As the number of potential devices in the mobility and IoT space expands, the flexibility and time-to-action that cloud solutions facilitate need to be understood and put into practice. It is also time to take all the data coming in from these devices and transform that flow into true contextual understanding and action, which also requires a dynamic computing environment.

3. Augmented reality

With augmented reality predicted to expand to a market somewhere between $120 billion and $221 billion in revenues by 2021, we're likely to see quite a bit of innovation in this space. The width of that range demonstrates the lack of a real understanding. 2018 should be a year where AR gets real.

4. Security

All discussions of mobility need to include security. Heck, the first month of 2018 alone should have nailed the importance of security into the minds of anyone in the IT space. There were more patches (and patches of patches) on a greater range of systems than many would have believed possible just a short time ago. Recently, every mobile app store (Apple, Android…) was found to contain nefarious software that had to be excised. Mobile developers need to be ever more vigilant, not just about the code they write but about the libraries they use.

5. Predictive Analytics

Context is king, and the use of analytics to increase understanding of the situation and possible responses is going to continue to expand. As capabilities advance, only our imagination will hold this area back from extending where and when mobile devices become useful. Unfortunately, the same can be said about the security issues that are based on using predictive analytics.

6. Changing business models

Peer-to-peer solutions continue to be all the rage, but with the capabilities listed above, whole new approaches to value generation are possible. There will always be early adopters who are willing to play with these, and with the deeper understanding possible today, new approaches to crossing the chasm will be demonstrated.

It should be an interesting year…