Carbon Nanotube Quality Control System


This project was driven by N12's need to scale up quality-control throughput to enable large-scale production of carbon nanotubes.  One of the most commonly measured properties of carbon nanotubes is their height; however, the traditional measurement method (SEM) is expensive, requires extensive training, has slow turnaround, and samples only a small area.

Our engineering team developed the concept of using a rastering laser scanner to measure a sample before and after the nanotubes were removed, where the difference gave the height of the nanotubes.  I was tasked with electrical and software integration, while my coworker worked on mechanical design.

Enclosure



One of the design constraints was that the system be as self-contained as possible.  I decided to house everything in a single enclosure, with switches and indicators mounted on the outside.  Inside are the sensor controller, stage controller, separate power supplies, terminal blocks, and standard, timed, and pneumatic relays.  The switches and indicators, used to secure the sample during measurement, are hard-wired.  All power and communication to the sensor and stage run through two 24-pin connectors.

Software

Due to prior experience and time constraints, I decided to code the system in Python on a PC.  The software aspect of the project entailed two subsystems: data acquisition/processing, and a user interface.  I further bucketed the first subsystem into two main scripts: one to collect and store data from a scan, and another to take two scans and create a subtractive height map.

After writing byte message handling functions for the sensor and finding a driver for the stage, I created a first multi-pass script for collecting and storing data.  During the raster, measurement and position byte data would be captured and saved in a tuple on each iteration.  Then, the byte data would be translated and transcribed to a .csv file.  I decided to use .csv files as outputs, instead of passing the data directly from script to script, in order to simplify troubleshooting.
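The multi-pass pattern can be sketched as follows.  Note that the packet layout and function names here are hypothetical stand-ins, since the actual sensor and stage byte formats aren't reproduced here:

```python
import csv
import struct
import time

# Hypothetical packet layout, for illustration only: assume each reading
# is 8 bytes, a uint32 stage step count followed by a float32 height.
READING_FORMAT = "<If"

def capture(read_packet, n_samples):
    """First pass: store raw bytes with timestamps, deferring all parsing."""
    raw = []
    for _ in range(n_samples):
        raw.append((time.monotonic(), read_packet()))
    return raw

def transcribe(raw, path):
    """Second pass: decode the stored bytes and transcribe to a .csv."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "step_count", "height_mm"])
        for timestamp, packet in raw:
            steps, height = struct.unpack(READING_FORMAT, packet)
            writer.writerow([timestamp, steps, height])
```

Keeping the capture loop free of parsing and file I/O is the same trick that later proved essential on the LiDAR project.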

Because the "position" read from the stage during motion was actually a step count from an origin (not an absolute encoder), the measurements from two successive scans weren't taken at exactly the same locations.  To enable proper subtraction, the height at specific locations (for instance, every millimeter along the scan) had to be interpolated from the actual scan data (this was implemented with a NumPy interpolation function).  Finally, the output of this script was a .csv containing the height profile and other statistics.  Later on, I added some low-pass filtering to remove spikes caused by dust particles settling on the substrate.
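A minimal sketch of the subtraction step, assuming each scan is stored as position/height arrays (the actual .csv column layout may differ):

```python
import numpy as np

def subtractive_height_map(scan_before, scan_after, pitch_mm=1.0):
    """Interpolate two scans onto a common grid and subtract.

    Each scan is a (positions_mm, heights) pair.  Because the stage
    reports step counts rather than absolute positions, successive scans
    sample slightly different locations, so both are interpolated onto a
    fixed-pitch grid before subtracting.
    """
    pos_b, h_b = scan_before
    pos_a, h_a = scan_after
    start = max(pos_b[0], pos_a[0])
    stop = min(pos_b[-1], pos_a[-1])
    grid = np.arange(start, stop, pitch_mm)
    heights = np.interp(grid, pos_b, h_b) - np.interp(grid, pos_a, h_a)
    # A low-pass or median filter pass to remove dust spikes would follow here.
    return grid, heights
```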


The user interface requirements were very basic: buttons to set naming IDs and start scans, and a list to display indicators (average, standard deviation) to the operator.  While the system also produced more detailed profiles as a .csv for later analysis, the operator could use the statistics as a cursory check.  I chose REMI for the user interface, which translates Python code into JavaScript and serves the site on a local server, as it fit my criteria and eliminated a lot of work.  REMI is event-based, which made it easy to integrate with my data collection and processing scripts.


Assembly, Testing and Validation

Once all the parts had arrived, my coworker and I spent a few weeks assembling, testing and validating the system.  The biggest software changes were to the user interface, after input from potential users about file naming conventions.  Aligning the pneumatic pistons that actuated the sample clamps also gave us some trouble, and ultimately required some shimming.

To ensure that the system provided meaningful data, I also coordinated a validation study using an SEM as ground truth.  This was modeled after a previous validation study we had done for a different height sensor, comparing measurements made on the same growth samples.  The result of the study was positive, indicating that the system was acceptable for use.




LiDAR Forest Mapping (Capstone Project)

Photo and point cloud of Ell Hall in Boston

The senior capstone project I worked on at Northeastern centered on LiDAR forest mapping.  Our project was sponsored by a research lab interested in creating high-resolution scans of entire forest canopies, something that wasn't possible with their ground-based system.  Our task was to deliver a portable system that could maintain a high point cloud density at all heights up to 15 meters.

Comparison of old system (left) and our project results (right)

We quickly split the project into two parts: a lift system, and an integrated LiDAR/controller package.  As the only team member with sensor integration experience, I took on the second part.  For the lift system, we ultimately decided to purchase an off-the-shelf pneumatically actuated mast.  While the ascent was jerky (using a hand pump), the descent was smooth and slow - perfect as a platform for creating a scan.  The stability of the mast also meant we didn't have to worry about translation or rotation about the x and y axes, simplifying the problem of point cloud reconstruction.

Operation of the pneumatic mast, with sensor package prototype on top

Our sponsor graciously provided us with a Velodyne Puck and a VectorNav inertial measurement unit (we wouldn't have had the money otherwise!) to use for the project.  Based on the UDP protocol of the LiDAR and the memory requirements, I chose to purchase a Raspberry Pi for data collection and processing.  I also purchased a 12V battery pack and a buck converter for field testing and eventual package integration.

Initial comms testing... on a tripod.

After writing functions to handle and parse the UDP and RS232 packets (LiDAR and IMU), I wrote a first version of the collection and processing script, which collected data and timestamps serially while also computing the cartesian coordinates and transforms, finally outputting to a .csv file.  The test results were poor - most of the LiDAR packets were being dropped, and the vertical and rotational transformations were off.  In my diagnosis, I realized the loop rate of the script (~50 Hz) was an order of magnitude slower than the incoming packet rates (700 Hz and 400 Hz for the LiDAR and IMU).  Because each LiDAR packet contained 1000 coordinate sets, the number of mathematical operations required was too great for the Raspberry Pi to do in real time, leading to buffer overflows for the incoming packets and mismatched data.
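The arithmetic behind that diagnosis is straightforward, using the rates quoted above:

```python
# Back-of-envelope check of why serial per-packet processing failed.
LIDAR_PACKET_HZ = 700       # incoming LiDAR packet rate
IMU_PACKET_HZ = 400         # incoming IMU packet rate
POINTS_PER_PACKET = 1000    # coordinate sets per LiDAR packet
LOOP_HZ = 50                # measured loop rate of the single-pass script

points_per_second = LIDAR_PACKET_HZ * POINTS_PER_PACKET  # 700,000 points/s
fraction_dropped = 1 - LOOP_HZ / LIDAR_PACKET_HZ         # ~93% of packets missed
```

At 700,000 coordinate transforms per second, the single-pass script never had a chance on a Raspberry Pi.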

Improvement in point cloud quality from single to multi pass

To remedy this issue, I decided to rewrite the script with multiple passes.  In the first pass, the byte packets from the LiDAR and IMU would be saved as a tuple in a list along with a corresponding timestamp.  In the second pass, the packets would be translated and the cartesian coordinates would be calculated, transformed with the integrated IMU data, and output as a .csv file.  The loop rate of the collection script increased 20x (to ~1000 Hz), resulting in no lost packets and accurate positional transformations.
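The per-point math in the second pass looks roughly like the sketch below.  The angle conventions, function name, and the yaw-only rotation are illustrative assumptions (the real script applies the full roll/pitch/yaw transform from the IMU):

```python
import math

def lidar_point_to_world(distance_m, azimuth_deg, elevation_deg,
                         yaw_deg, z_offset_m):
    """Convert one LiDAR return to world coordinates.

    Azimuth is measured about the vertical axis, elevation from the
    horizontal plane.  Only the IMU yaw rotation and the mast height
    offset are applied here, for brevity.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    # Sensor-frame cartesian coordinates from the spherical return.
    x = distance_m * math.cos(el) * math.sin(az)
    y = distance_m * math.cos(el) * math.cos(az)
    z = distance_m * math.sin(el)
    # Rotate about the vertical axis, then translate up the mast.
    yaw = math.radians(yaw_deg)
    xw = x * math.cos(yaw) - y * math.sin(yaw)
    yw = x * math.sin(yaw) + y * math.cos(yaw)
    return xw, yw, z + z_offset_m
```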

To mount and house the sensor and components, I worked with another team member to design an internal housing block for an off-the-shelf enclosure.  It included mechanical vibration and shock damping for the IMU, an ethernet knockout for in-field point cloud file access over FTP, and enough battery power for 20 scans.

Designs for (left) and actual (right) LiDAR sensor package




Differential Tension Measurement for Web Handling/Roll-to-roll



N12 grew carbon nanotubes on a continuous stainless steel ribbon under tension, and problems with web handling proved to be a large source of loss.  Among the issues resulting from uneven loading of the web were:
  1. Differential plastic deformation of the web at high temperature, causing a bow upon cooling
  2. Inability to maintain constant vertical distance from heating elements
  3. Web walk (side to side)
To address this issue, I was tasked with designing a system that could measure differential tension in a nonintrusive way, and that could be integrated into a tension control system down the road.

After writing a work plan, the first step was background research into the issue.  After doing some quick calculations, I determined that using a laser distance sensor to measure either the frequency or amplitude of vibrations in the web should yield a usable result.  The data shown in the background document came from a Keyence sensor I requested as a demo.
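For context, the quick calculation here is the classic tensioned-string model, where the fundamental frequency depends on tension, linear density, and free span.  The numbers below are purely illustrative, not the actual web parameters:

```python
import math

def fundamental_frequency_hz(tension_n, linear_density_kg_m, span_m):
    """Fundamental frequency of a web span under tension, modeled as an
    ideal vibrating string: f1 = sqrt(T / mu) / (2 * L)."""
    return math.sqrt(tension_n / linear_density_kg_m) / (2 * span_m)

# Illustrative values only: a 1"-wide, 0.1 mm-thick stainless ribbon
# under 100 N of tension, vibrating over a 0.5 m free span.
rho_steel = 7900.0                      # kg/m^3
mu = rho_steel * 0.0254 * 0.0001        # ~0.020 kg/m linear density
f1 = fundamental_frequency_hz(100.0, mu, 0.5)
```

Because the frequency scales with the square root of tension, a tension difference between the two edges of the web shows up directly as a frequency difference, which is what makes the vibration measurement usable.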



Raw data collected from the distance sensor

After reviewing the background work with my boss, we decided to move forward with a full measurement system.  Speccing the sensor came down to three parameters: sample rate, based on the natural frequency calculations (around 1000 Hz); range, based on the amplitude observed during the initial testing with the Keyence sensor (a few millimeters); and resolution, also determined empirically (tens to hundreds of microns).  I also put a lot of weight on the sensor having a fast, reliable communication protocol (e.g. Modbus TCP).  In the end, I chose the Wenglor PNBC002.
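The sample-rate requirement follows from the Nyquist criterion.  A small sanity check, using the ~1000 Hz figure above (the margin factor is an assumption, not from the original spec):

```python
# Sample-rate sanity check for the vibration measurement.
f_interest_hz = 1000.0                    # highest frequency of interest
nyquist_min_hz = 2 * f_interest_hz        # theoretical minimum sample rate
# In practice a healthy margin (here 5x) helps resolve the waveform shape,
# not just detect the frequency.
practical_rate_hz = 5 * nyquist_min_hz
```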

Finding a sensor that met these requirements meant shelling out a few thousand bucks, so we decided to switch from the initial plan of having two sensors looking at either edge of the web, to having a single sensor on a rastering stage.  The stage I selected was from Newmark Systems, and communicated over RS232 (however, offline testing was done with a manual stage, for ease).

Once everything was assembled and scripted (I used Python for collection/processing/control, and REMI for the GUI), I went about validating it in an offline setting.  This was done on test rigs for two widths (1" and 15", current and future) of stainless steel web.  The test rigs allowed for the application of uneven loads to the web, and a manual tensiometer was used as a rough gauge of the global (1") and local (15") tension.  The result was two reports showing the efficacy of the system.


 1" web test rig                                  15" web test rig





After offline validation, I had a mount block machined for the automated stage and incorporated the measurement into our data acquisition system.  Unfortunately, we stopped using this reactor (and abandoned roll-to-roll altogether) soon after finishing.

Display interface for integrated system


