GLEW'S NEWS BLOG


Semiconductor Test Equipment Supplier Reaches Huge Milestone

 
Advantest V93000 Test System (image provided by Advantest)

Advantest, a powerhouse in the semiconductor test equipment arena, recently announced that it had shipped a milestone 1000th V93000 test system [1].  This really is a milestone in an industry that moves so fast that most pieces of capital equipment seem to have lifespans shorter than that of a mayfly.

V93000 Steps Onto the Scene

Verigy introduced the V93000 in 1999.  In those days I was running the engineering group at Electroglas, a big wafer prober company.  Intel was our major customer and keeping them happy seemed to be my primary job function.  We were called into a meeting and were shown the V93K probe card for the first time.  It was a monster compared to typical probe cards. 

A probe card is the electromechanical interface between a device tester (like the V93K) and the IC devices to be tested on a wafer.  It is, in essence, a heavy-duty printed circuit board that routes signals from the device tester to a grid of microscopic needles (probes), which can then contact the connection points on an IC while it is still on the wafer and not yet packaged.  Devices can be tested electrically, and bad devices can be detected before they are diced up and placed into fairly expensive packages, saving the manufacturer time and money.
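
To put a rough number on that saving, here is a back-of-the-envelope sketch in Python.  All of the figures are hypothetical illustrations, not data from Advantest or any particular fab:

    # Illustrative arithmetic only; die count, yield, and package cost are
    # assumed values, not figures from the article.
    dies_per_wafer = 500      # assumed dies on one 300mm wafer
    wafer_yield = 0.85        # assumed fraction of good dies
    package_cost = 2.50       # assumed cost, in dollars, to package one die

    bad_dies = dies_per_wafer * (1 - wafer_yield)
    savings_per_wafer = bad_dies * package_cost
    print(f"Packaging cost avoided per wafer: ${savings_per_wafer:.2f}")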

The number of tester pins it has to connect to dictates the size of a probe card.  For a long time, an 8-inch diameter probe card had plenty of real estate for connecting to the tester.  With an 8-inch probe card one could route 128 test lines to the probe array and that seemed like enough for the time.  8-inch probe cards had been around for several years and all the wafer prober companies had built their systems around the expectation of mechanically interfacing to them.  Verigy saw the future, however, and knew that with the transition of the semiconductor industry to 300mm wafers would come larger and larger devices with bigger and bigger pin counts.

The V93000's Impact on the Semiconductor Industry

The V93K test head was massive.  Its probe card was almost 12 inches in diameter and could handle more test pins, which was very important to Intel, since their future lay in bigger and faster microprocessors with many more connection points.  The impact on Electroglas and every other prober company was the need to mechanically interface with the monster probe card.  We were being told in no uncertain terms that our new model tool would have to accept the V93K probe card or Intel would not buy it.

It was an exciting time for me personally because I got to preside over and direct the system specification and design for a generational change of our tool.  There were many challenges in growing a system that had been dealing with 200mm wafers and 8-inch probe cards into one that could handle 300mm wafers and 12-inch probe cards.  Which core technologies could we keep and which had to go?  How quickly could we produce a system?  Would I ever see my family again?  Verigy had made a bold move to a new standard in test, and the knock-on implications rippled throughout the industry.

Why Is the V93000 Still the Top Choice?

For more than 15 years, Verigy (acquired by Advantest in 2011) has pushed the envelope in the world of high-speed test.  They increased the pin density of their tester and the top speed of the electronics again and again to keep the V93000 system fresh and relevant.  Their commitment to innovation and the longevity of their flagship test system can be an example to us all.

The semiconductor industry is now poised to transition yet again to 450mm diameter wafers and exactly the same challenges face the whole test and processing equipment industry.  If you are not critically examining your company’s offerings with an eye to 450mm you should be.  If you could use a system-level architect who has been through this process numerous times to help specify and manage the development of your next big thing, perhaps we can help.

https://www.advantest.com/US/News/ADVP008934

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 

Mechanical Engineers Develop Squishy Robots

 

 

The robotics industry is constantly taking advantage of new technology and thus is constantly growing and evolving.  The robots of today are no longer big, clunky machines that are dangerous to operate and dangerous to be around.  In fact, robots now not only work side-by-side with humans, but are also used to effectively perform medical procedures on people.  While all these advancements are amazing, people in the industry are constantly searching for ways to move the field forward.  This week mechanical engineers at MIT created a new material made from polyurethane foam and wax, which may find application in "soft" robots.

New Squeezable Material

MIT mechanical engineering professor Anette Hosoi and her former graduate student, Nadia Cheng, alongside researchers at several different institutes and universities have developed a new material that could allow robots to "squeeze through small spaces and then regain their shape" (Thilmany, 2014).  This advancement would be a huge step for the robotics industry, which is constantly striving to reduce the size of robots, and make them able to get into hard-to-reach areas.  

This new material creates many new possibilities for how robots could be used in the very near future.  In the past, metal, plastic, wood, or composites have been the primary materials used for constructing robots.  The one thing these all have in common is that while they are extremely tough and durable, they are only minimally flexible.  This new material "made from wax and foam is capable of switching between hard and soft states" (Thilmany, 2014). 

In order to even start this process, the researchers needed to work out how they were going to create a soft material that was still controllable (a necessity when working with robots).  They were able to accomplish this by "coating a foam structure with wax" (Thilmany, 2014).  As we all know, foam can be easily squeezed into small spaces, making it the perfect candidate for such an ambitious task.  Foam also has the ability to bounce back to its original shape and size after being squeezed into tight spaces or shapes.  The benefit of using wax is that it has a relatively low melting point and is easily cooled.  According to Hosoi, "running a wire along each of the coated foam struts and then applying a current can heat and soften the surrounding wax".  Wax is an adaptable material.  If fracturing occurs, the wax can be reheated and then cooled, and the structure returns to its original form.  This provides some room for error without costing a fortune to repair what is already an expensive robot.
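
To get a feel for the Joule-heating idea, the sketch below estimates how long an embedded wire would take to soften the wax on a single strut.  Every number is an assumption chosen for illustration (typical paraffin properties, a guessed wax mass and wire resistance), not a value from the MIT work:

    # Rough sketch of heating a wax-coated strut with an embedded wire.
    # All values are assumptions for illustration only.
    wax_mass = 0.002      # kg of wax on one strut (assumed)
    c_wax = 2100.0        # J/(kg*K), approximate specific heat of paraffin
    latent_heat = 200e3   # J/kg, approximate heat of fusion of paraffin
    delta_t = 35.0        # K, e.g. from a 20 C room up to a ~55 C melting point

    energy_needed = wax_mass * (c_wax * delta_t + latent_heat)  # joules

    current = 1.0         # amps through the embedded wire (assumed)
    resistance = 5.0      # ohms of wire along the strut (assumed)
    power = current**2 * resistance                              # I^2 * R, watts

    print(f"Energy to soften the wax: {energy_needed:.0f} J")
    print(f"Time at {power:.1f} W, ignoring losses: {energy_needed / power:.0f} s")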

Building a New Robot

The process of building this new "Squishy Robot" began when "researchers placed a polyurethane foam lattice in a bath of melted wax, they then squeezed the foam to encourage it to soak up the wax" (Thilmany, 2014).  The foam works similarly to a sponge in that it can absorb liquids.  This still makes one wonder, though, how the wax remains inside the foam lattice after it has been heated.

This clearly was something that came up in their research, because for the second version of the foam lattice a "3D printer was used to allow them to carefully control the position of each of the struts and pores" (Thilmany, 2014).  This made the printed lattice more controllable than the original polyurethane foam model, but it also increased the cost.  While the first version works, the printed version has the ability to be modified and refined through test analysis.

What Will They Be Used For?

With a robot that can squeeze into tight spaces and then regain its original shape, the possibilities for its use seem nearly endless.  I could see them being used by police departments to disable bombs, which would keep officers out of the line of fire.  Another use the engineers at MIT believe possible is having this "soft" robot serve as a medical device that can "move through the body to reach a particular point without damaging organs or blood vessels along the way" (Thilmany, 2014).  I can't wait to see what role this new squishy robot plays in our future.

https://www.asme.org/engineering-topics/articles/robotics/squishy-robots?cm_sp=Home-_-HomeContent-_-Squishy-Robots

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 

Thermal Analysis Devices Just Got More Affordable

 

 

Image: Finney County in southwestern Kansas is now irrigated cropland where once there was short-grass prairie.  NASA IR image with false color.  Photograph credit: NASA/GSFC/METI/Japan Space Systems, and U.S./Japan ASTER Science Team

Current Uses of Thermal Analysis Devices

One of the benefits of our space program (apart from TANG®) has been the development of infrared (IR) detector technology.  Various thermal analysis cameras that can see from the near IR (around 800-1200 nm) to the far IR (8-12 µm), depending on their detector technologies, have for decades been a part of many public and not-so-public satellite programs that observe everything from crops, to images of your city, to Homeland Security-related subjects.
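
A quick way to see why those wavelength bands matter is Wien's displacement law, which relates an object's temperature to the wavelength at which its thermal emission peaks.  This short, purely illustrative Python sketch shows that room-temperature scenes emit most strongly right in the 8-12 µm band:

    # Wien's displacement law: lambda_max = b / T, with b ~ 2898 um*K.
    WIEN_B = 2898.0  # um*K

    for temp_k in (300, 500, 1500):  # room temperature, hot machinery, a flame
        peak_um = WIEN_B / temp_k
        print(f"A {temp_k} K object emits most strongly near {peak_um:.1f} um")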

The government has pumped money into IR sensor technology through various agencies and we all get to benefit as the results get to market.  We can't get our hands on the super-secret defense cameras yet, but there are some cool new things coming to Amazon real soon. Thermal analysis cameras will soon be available for purchase by consumers.

My Work With IR Cameras

I have worked on IR microscopy and thermal imaging systems and analysis for years in order to see into the workings of semiconductor devices.  The systems I have worked on are complex combinations of high-accuracy motion systems and specialized optics such as solid immersion lens (SIL) technology; for the most recent system, I architected a full wafer-level prober integrated with the diagnostic tool so that testing could be done at the wafer level.

Those interested in that system can see a paper I presented at the IEEE Semiconductor Wafer Test Workshop in 2012:

http://www.swtest.org/swtw_library/2012proc/PDF/S08_02_Portune_SWTW2012.pdf

It turns out that silicon is largely transparent (depending on doping) to near-IR wavelengths.  This allows for some really interesting diagnostic opportunities.  If you could see in the near-infrared region and looked at the backside of a chip as it operates, you would see what looks like a cityscape at night from space; depending on the magnification of the optics, you could see all the way down to a single transistor blinking as it switches.  Such transitions are visible because, as a transistor switches, it passes briefly through its linear region and emits a few photons of IR energy.

Static, bright spots can be heat signatures from power dissipation like shorts or heavy current draws.  Blinking spots result from the ON-OFF-ON transitions of flip-flops as each transistor slides briefly through its linear region on its way to a stable state.  With the right magnification optics it is possible to zoom in on individual cells and look for logic faults, stuck-at faults and crosstalk effects that result from subtle design rule violations.  If a system adds an IR laser, it can stimulate the circuitry and then changes in operating behavior can be seen.  The world of semiconductor failure analysis (FA) owes a lot to these systems.
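
The real FA tools use far more sophisticated acquisition and analysis software, but the toy sketch below illustrates the basic idea of separating static hot spots from blinking emission sites by comparing two normalized IR frames.  The thresholds and data are invented for illustration:

    # Minimal sketch, not the actual tool software: classify pixels that are
    # bright in both frames (static) versus pixels that change (blinking).
    import numpy as np

    def classify_spots(frame_a, frame_b, static_thresh=0.8, blink_thresh=0.3):
        """frame_a, frame_b: 2-D arrays of normalized IR intensity (0..1)."""
        static = np.minimum(frame_a, frame_b) > static_thresh    # bright in both frames
        blinking = np.abs(frame_a - frame_b) > blink_thresh      # changed between frames
        return static, blinking

    rng = np.random.default_rng(0)
    frame_a = rng.random((64, 64))   # stand-ins for two captured IR frames
    frame_b = rng.random((64, 64))
    static, blinking = classify_spots(frame_a, frame_b)
    print(static.sum(), "static pixels,", blinking.sum(), "blinking pixels")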

The heart of all these systems, from diagnosing bad ICs to seeing bad guys at night from space, is the IR camera.  They have always been very expensive (our system's camera costs in the tens of thousands of dollars), and in order to get decent S/N on the image they typically need to be cooled.  The best such cameras have traditionally used liquid nitrogen to get the sensor down to around 70 K.  One of the big names in IR sensor camera technology in the U.S. is Raytheon.

IR Imaging Comes to Consumers

According to a recent journal publication from Raytheon (http://www.raytheon.com/newsroom/technology_today/2014_i1/nextgen.html), new breeds of IR sensors that do not require cooling are becoming available.  Although Raytheon's sensors have traditionally been produced in very low quantities at very high cost, the company has partnered with Freescale Semiconductor to make these devices in mass quantities.

This means that the consumer can have a useful, low-cost thermal imaging camera system.  Just this week, Seek Thermal (http://www.obtainthermal.com), a Santa Barbara-based startup, made a $199 IR camera/sensor accessory for smartphones available for purchase.  Their website illustrates some intriguing applications for the camera in a consumer environment.

Specialized, high-cost IR camera systems will continue to have a place in industry.  When you need to see individual photons and resolve spots down to the sub-micron level, only the most cutting-edge camera will do.  For those of us in the industrial world, we can complete the circle by thinking of things to do with a really low-cost IR camera in the factory.  For the price of one so-called industrial camera you could perhaps network 20 or so cheap ones and get better results.  Personally, I have a few ideas that I plan to pursue.  Stay tuned.

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 

Engineering Consulting Firms and Hollywood Share a Common Bond

 


Most people would never think that an engineering consulting firm would have anything in common with Hollywood, but in reality they share a common bond: breathing new life into something outdated and making it relevant today.

Godzilla Gets a New Look Thanks to Technology

My wheels began turning on this subject after recently reading in the San Jose Mercury News about a 4K ultra-high-resolution restoration, underway in Japan, of the Godzilla film franchise.  Although the appeal of an incredibly crisp, fully restored version of the 1954 classic is certainly there for a nerdy baby boomer like me, my takeaway from the article was focused more on the connection I felt towards what they were trying to accomplish.

In the article, the restoration team was quoted as saying that the scanning technology they are using is so good that they are discovering detail and nuance in the original source material that has not been seen since the film was made.  The original transcription and projection technology was just not up to the challenge.  Thus, the amazing depth and contrast resolution that have been lying hidden in the silver halide crystals of the old film stock can only be seen (wires and all!) with today's technology.  In essence, it has taken a new team of skilled technical people, armed with new technology, to reveal the hidden features and thus breathe new life into a very old product.

I see the film restoration process as an excellent analog to the process that an experienced engineering consulting firm can bring to a company's established products.  It is something that I have been doing as an experienced systems electrical engineer for years.

Engineers are Skilled at Revamping Products

So, why use an outside firm to "restore" an old product?  There are many reasons, but here are a few:

  • Fresh Eyes:  The consulting team can experience the product in a fresh way, and like the film restoration team, uncover the hidden detail and design intent of the product.

  • Different Skillsets and Experience Base:  My personal consulting EE experience includes electronic design, robotics, wafer probing, surface metrology, infrared microscopy, cleanroom technology, vacuum transport, and front-end systems.  When you add my firm's other services to the mix, as an engineering team we are ideally placed to evaluate old implementations and propose new and novel ways to skin the original cat (or dinosaur).

  • No Axe to Grind:  An outside team is not influenced by office politics or the pet projects of an in-house team.  The consulting engineering team can work with in-house resources to get at the original design intent in the same way that the film restoration team uncovers the director's vision.  They can also be objective and provide alternative implementation proposals, often bringing new technology from other fields into play to reduce cost, replace obsolescent designs, add features, and thus breathe new life into the old beast.

I am sure that the restoration team in Japan feels both excited and humbled by their great undertaking.  I share those feelings every time my team takes on a new challenge.  It is the reason why I keep at it after many years in the business and look forward to hearing the "roar" of the finished project as it takes on the world, all over again.

http://www.mercurynews.com/entertainment/ci_26432609/godzilla-stomps-back-ultra-hd-wires-intact

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 

Materials Science News: 2-D Phosphorus, the Future for Solar Cells?

 


Like most industries, the semiconductor industry is not impervious to economic highs and lows.  After a few rough years the industry is recovering, and along with this recovery has come a wave of development and development dollars.  This week materials science researchers announced that 2-dimensional phosphorus could be part of the future for the semiconductor industry.  One theory is that 2-dimensional phosphorus could eventually replace the more commonly used silicon; how will this affect the future of semiconductors?

Silicon in Semiconductors

Silicon atoms (specifically in crystalline form) are able to create perfect covalent bonds with each other.  This means that once the bond is made, the atom does not gain or lose electrons easily.  When silicon atoms bond to each other, each atom bonding to four neighbors, they form what is called a lattice.  A pure silicon crystal is naturally an insulator and does not allow much electricity to flow through it.  It is possible to change the behavior of silicon by doping it.  Doping means adding a small amount of an impurity to the silicon, which destabilizes the covalent bonds.  There are two different types of doping that are done to silicon:

  • N-type (when phosphorus or arsenic is added): creates a good negative conductor

  • P-type (when boron or gallium is added): creates a good positive conductor

Adding either an N-type or P-type dopant turns silicon from a good insulator into a good (not great) conductor, and therefore creates a semiconductor.  Neither N-type nor P-type doping is novel on its own, but putting the two together creates a diode, the simplest semiconductor device.  A diode allows current to flow in one direction but not the other.
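
The one-way behavior of a diode is captured by the ideal (Shockley) diode equation, I = Is*(exp(V/(n*VT)) - 1).  The short sketch below evaluates it with typical textbook values; the saturation current and ideality factor are assumed for illustration and are not tied to any particular device:

    # Ideal diode equation with assumed, typical small-signal values.
    import math

    I_S = 1e-12    # saturation current in amps (assumed)
    N = 1.0        # ideality factor (assumed)
    V_T = 0.02585  # thermal voltage at ~300 K, volts

    def diode_current(v):
        return I_S * (math.exp(v / (N * V_T)) - 1.0)

    for v in (-1.0, 0.3, 0.6, 0.7):
        print(f"V = {v:+.1f} V -> I = {diode_current(v):.3e} A")
    # Reverse bias leaks only ~1e-12 A, while forward bias above ~0.6 V
    # conducts readily: current flows in one direction but not the other.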

New Research

While phosphorus is not in the same group as silicon or carbon (see periodic table) [1], materials scientists at Rice University have found it to be a promising candidate for "Nano-electronic applications" that require stability [2].  To be clear, this is not phosphorus in its familiar everyday form.  Rather, it is a "two-dimensional phosphorus, [made] through exfoliation from black phosphorus" [2].  Black phosphorus is believed to be the most stable form of phosphorus.  It is created when phosphorus is put at "higher temperatures about 590 °C and higher pressures" or when phosphorus is combined with a "catalyst at ordinary pressures and a temperature of about 200°C" [3].

Researchers at Rice University compared 2-dimensional phosphorus with 2-dimensional metal dichalcogenides like molybdenum disulfide because of their inherent conductive properties (metals are natural conductors).  Issues have arisen, however, where these other compounds bond, at the points where the elements meet (point defects), creating a disturbance in the flow of current.  In doped silicon this doesn't occur, because the negatively and positively doped silicon work together to fill in these gaps, eliminating any disruption in flow.  When there are multiple point defects or grain boundaries ("where the sheets of a 2-D material merge at angles"), the device is no longer useful [2].

Advantages of Phosphorus

2-dimensional phosphorus does not exhibit the same issues at point defects that the other materials tested experienced.  According to calculations done by theoretical physicist Boris Yakobson and his colleagues at Rice University, where point defects or grain boundaries exist in 2-dimensional phosphorus, the material's semiconducting properties remain stable.  This happens because at the point defects "atoms jut out of the matrix, this complexity gives rise to more variations among defects" [2].  Also, 2-D phosphorus bonds with itself, which eliminates the recombining of electrons that occurs at hetero-elemental bonds.  2-dimensional phosphorus is very similar to 3-dimensional silicon in that neither has issues with band-gap changes at grain boundaries.  The key difference between the two is that 3-dimensional silicon can change its properties from positive to negative at point defects, and this does not occur in phosphorus.  Another benefit of 2-dimensional phosphorus is that phosphorus exists in abundance on Earth, and black phosphorus is relatively easy to make.  No production-worthy semiconductor equipment is available yet for this material, however.

Future of Phosphorus Semiconductors

The researchers at Rice University believe that 2-dimensional phosphorus semiconductors could potentially be used to harvest sunlight in solar cells because their band gap matches well with the solar spectrum.  Due to the way this new phosphorus responds at point defects, the material's performance would not deteriorate as it has with the other materials tested [2].  This is great news for the solar industry, which is constantly looking for new ways to improve its products and make them more durable and efficient.
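
The band-gap claim is easy to sanity-check with the standard conversion between gap energy and absorption-edge wavelength, lambda (µm) ≈ 1.24 / E_gap (eV).  The sketch below uses approximate, assumed gap values (bulk black phosphorus near 0.3 eV, silicon at 1.1 eV, a very thin layer near 2 eV) rather than figures from the Rice paper:

    # Longest wavelength a material can absorb, estimated from its band gap.
    def cutoff_wavelength_um(band_gap_ev):
        return 1.24 / band_gap_ev   # hc ~ 1.24 eV*um

    # Approximate, assumed band gaps in eV.
    for name, gap in (("bulk black P", 0.3), ("silicon", 1.1), ("thin 2-D P", 2.0)):
        print(f"{name}: {gap:.1f} eV gap -> absorption edge near "
              f"{cutoff_wavelength_um(gap):.2f} um")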

2-dimensional phosphorus has already been tested in "high-performing electronics, and has already shown it can be a better transistor than 2-D metal dichalcogenides" [2]. 

So far the future looks bright for the use of 2-dimensional phosphorus in semiconductors instead of silicon.  Semiconductors and their success affect our lives every day without people even realizing it.  Semiconductors are in all of our electronic devices, from our smartphones to the computers in our cars.  Their effectiveness is what keeps us connected in today's technology-dependent society.  If phosphorus is the answer to fewer interruptions in our devices, then it will be welcomed with open arms because, let's be honest, nothing is more upsetting than when your smartphone malfunctions.

[1] http://www.mpoweruk.com/images/periodic_table.gif

[2] http://www.rdmag.com/news/2014/09/phosphorus-promising-semiconductor

[3] http://www.britannica.com/EBchecked/topic/68159/black-phosphorus

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 

3D Printing Part 2: A Consulting EE's views on 3D Printing in Space

 


I am a consulting electrical engineer (consulting EE) and this is part two in my series on 3D printing.  Today I will be discussing the possibility of 3D printing making its way into space, and why I believe it is possible based on my experience with 3D printing.

3D Printer Headed to Space

3D printing seems to be everywhere these days, from people's living rooms to Office Depot, and now it seems space is the next stop.  NASA is planning to send a 3D printer to the International Space Station (ISS).  The agency has already done some preliminary experiments on the "vomit comet" airplane, which were successful enough to move into the next phase of experiments.  Missions 41/42 and 43/44 will be starting in September 2014 and proceeding into 2015.

3D Printing Process

I believe that there is no reason that 3D printing would not work in zero-G.  The process does not depend on gravity: raw material (filament) is mechanically pushed into a heated chamber that terminates in a nozzle.  It is then extruded in a thin bead, and the extruder is moved to lay down the pattern on each layer.  Each successive layer melts into the preceding one and thus sticks where it is placed.

The first layer is the tricky part.  It has to adhere to a bed and this is a universal problem for all 3D printers to solve.  The extruded material has to stick to the bed just enough to hold it in place both during the printing process and also while it cools.  The adhesion has to resist the tendency of the material to shrink as it cools.  Not enough "stick" and the first layer shrinks unevenly on its long axis and curls away from the bed.  Too much "stick" and the unfinished piece cannot be removed from the bed without damage to the piece or bed.

Lots of experimentation is going on to try and achieve a reliable, repeatable bed surface.  There are many hobby solutions and some serious materials science is also happening to find just the right coating for the perfect stick/release surface. 

For more information about a Kickstarter-funded group that is making some inroads into solving the problem, see:

http://www.geckotek3d.com/

Benefits of 3D Printing in Space 

However it is achieved, the first layer is extruded onto a bed surface and adheres temporarily, without bonding.  There is a potential advantage to printing in a zero-G environment: bridging large gaps with molten filament is no longer a problem.

The traditional issue with gaps is that the extruded material hangs unsupported as the nozzle travels over a gap; imagine a rope suspended over a chasm.  Because the hot filament is still viscous when it leaves the extruder nozzle, gravity makes it droop.  It solidifies as it cools, but by then the damage is done and you no longer have straight lines of material over gaps.  In space, this problem goes away, at least in theory.  Perhaps it will be replaced by another problem, since the extruded material has some inertia when it leaves the nozzle.  That is something we can learn when the printer gets up there.

http://www.nasa.gov/mission_pages/station/research/experiments/1115.html#results

 

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 

3D Printing Part 1: First attempt by a consulting EE

 
My Experience with 3D Printing

I am a consulting electrical engineer (consulting EE), and want to share my first attempt in the world of 3D printing.  Last May I acquired a SeeMeCNC “Orion” delta-style printer at the Maker Faire in San Mateo, California.  Since then I have used three pounds of plastic filament and printed many terrible failures on the road to some beautiful components.  Figure 1 shows an example of a gear from a gear cube that I designed using SolidWorks™.  The blue part is an early print and is very rough.  The red part was printed after I adjusted the process.  Figures 2-6 show the evolution of a vase throughout the 3D printing process.

Figure 1: Gears, from rough (blue) to smooth (red)

Figure 2: A 10-hour print run of a vase, in its beginning stages.

Figure 3: The 10-hour print run of a vase, further along in printing.

Figure 4: The 10-hour print run of a vase, almost completed.

Figure 5: The 10-hour print run of a vase.

Figure 6: The completed vase.

The control of 3D Printing

Despite the advances made by countless experimenters, hackers, and hard-core engineers in the field, 3D printing is still in its infancy.  As a hobby, it is comparable to the very early days of personal computers (remember the IMSAI 8080?) in which useful results could be obtained, but only if you were willing to do a lot of very manual work.  As a business, it is not yet plug-and-play, and I have a sneaking suspicion that companies who offer printed parts for hire make a fair bit of scrap that the end customer never sees.  I look forward to more prototyping with 3D printing.

My electrical engineering career has been tied to the semiconductor equipment industry for many years, so I am no stranger to process control.  In a semiconductor fabrication facility (fab), the ability to diagnose, measure, and control fairly complex processes determines one's success.  Tiny variations in gas flow rates, annealing temperatures, etch time, and a hundred other factors can be the difference between a wafer full of pricey graphics processing units (GPUs) and one that is the failure analysis (FA) lab's worst nightmare.

In my attempt to master the 3D printing process I have had to bring my process control and continuous improvement experience to bear and work out a series of experiments to help me “dial in” my printer.


Figure 7: The Deming Circle, the classic continuous-improvement cycle

 

3D Printing Process Variables 

This may seem like a bit of overkill for a “hobby”, but it is ingrained in my electrical engineering DNA, and I know that careful planning, with incremental-change experiments and careful examination and analysis of the results, will yield better and better outcomes.  Good results are all about process control.

There are many process variables that affect the quality of a 3D print.  Like any real-world system, they are interrelated; no single parameter can be changed without having a ripple effect on other parameters.  I have been experimenting carefully with each parameter, a little at a time, printing and re-printing test models in the same fashion I would for a consulting client.  I have designed simple geometric shapes in computer-aided design (CAD) that each stress a particular feature or function of the print.  In future posts I will cover more of them in detail, but for now, here are some of the “high nails” of the process.

Extrusion Temperature

There is no real standard for the purity or content of any of the plastic filament available today.  As a result, the melt point, glass transition point, and other physical properties of plastic filament will vary from batch to batch and color to color.  The range can be as much as 20-30 °C!  Too hot, and the plastic will dribble out of the nozzle like a bad head cold; too cool, and the extruder motor will be unable to push the filament through the nozzle fast enough to give consistent flow.

Flow Rate

3D printers are “dead reckoning” systems.  They depend on stepper motors to drive filament through a heated extruder nozzle and have to guess at how much plastic is coming out.  Clever software calculates expected flow based on filament diameter and commanded filament speed, but there is no feedback in these systems to make adjustments.
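
As a concrete illustration of that dead-reckoning arithmetic, the slicer simply assumes that the volume of filament pushed in equals the volume extruded out.  The numbers below are typical hobby-printer values, not measurements from my Orion:

    # Open-loop flow estimate: filament volume in = extruded volume out.
    import math

    filament_dia = 1.75   # mm, common filament diameter (assumed)
    nozzle_dia = 0.4      # mm, common nozzle bore (assumed)
    feed_rate = 2.0       # mm/s of filament pushed by the extruder stepper (assumed)

    filament_area = math.pi * (filament_dia / 2) ** 2   # mm^2
    volumetric_flow = filament_area * feed_rate         # mm^3/s leaving the nozzle
    nozzle_area = math.pi * (nozzle_dia / 2) ** 2       # mm^2

    print(f"Commanded volumetric flow: {volumetric_flow:.2f} mm^3/s")
    print(f"Bead exits the nozzle at roughly {volumetric_flow / nozzle_area:.0f} mm/s")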

Layer Height vs. Extrusion Diameter

In each printed layer, a ribbon of molten plastic is extruded from a nozzle of given diameter.  Each layer sits on top of a previous layer and is flattened slightly based on the height of the nozzle above the previous layer.  Too close and the layer deforms as it is extruded.  Too far away and it may not adhere to a previous layer.  These things are a major factor in surface finish and strength of the final part.
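
One common way slicers tie these parameters together is to model the flattened bead as a rectangle capped by two half-circles whose diameter is the layer height, then match that cross-section to the volumetric flow.  The sketch below uses that approximation with assumed, typical values:

    # "Stadium" bead cross-section model, with illustrative values.
    import math

    layer_height = 0.2    # mm (assumed)
    line_width = 0.45     # mm, a bit wider than a 0.4 mm nozzle (assumed)
    bead_area = (line_width - layer_height) * layer_height \
                + math.pi * (layer_height / 2) ** 2          # mm^2

    volumetric_flow = 4.8  # mm^3/s, e.g. from the flow-rate sketch above
    print(f"Bead cross-section: {bead_area:.3f} mm^2")
    print(f"Print speed that flow can sustain: {volumetric_flow / bead_area:.0f} mm/s")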

I have learned a lot over the past few months and continue to learn from experiment and from collaboration with the vibrant community of owner/experimenters here in the SF Bay Area and Silicon Valley.  The RepRap wiki is an incredible source of information on 3D printing in general.  Presently, my success rate is about 70-80%, and so the experiments continue in between prints of artistic or functional pieces.  This is a journey in which my engineering background complements my hacking spirit.  More to come in the following series on 3D printing.

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 

The Need For Engineering Heroes

 


Recently, IEEE Spectrum writer G. Pascal Zachary wrote an article, "Where Are Today's Engineering Heroes?"  The article describes the lack of engineering heroes in today's society.  Not only is celebrating heroes a good way to inspire young people and inform the public, it is also necessary.  The lack of heroes negatively affects engineering because it diminishes the enterprise in the public eye and constricts the flow of talent into the field.  In a society that hero-worships rock stars and movie stars, serious fields are lacking serious heroes.

While many would argue that there are plenty of engineering heroes in today's society (Hewlett and Packard, Steve Jobs, or Bill Gates), those individuals are celebrated mostly for building huge corporations based on technology created and developed by many.  Basically, the engineers who earn the most fame are the ones who make the most money.  That would lead others to believe that in order to be a hero you must first amass a fortune.  While Zachary states there is nothing wrong with profiting from your ideas, it shouldn't be the sole marker for a hero in the industry.

Zachary believes that engineering may be lacking heroes because many truly do not understand the work of engineers anymore.  When Edison created the phonograph in 1877, everybody could relate to the invention.  However, today when an engineer designs a microprocessor with 2 billion transistors instead of 1.5 billion, your average individual does not understand the significance.  Zachary also believes that engineers face a structural impediment, since there is no Nobel Prize for engineering, nor is there an engineering award with similar global status and prestige.  While a few engineers have received the Nobel Prize in other fields, without a Nobel of their own, engineers cannot anoint their heroes in the same way physicists, economists, or authors can.  While engineering does have the Kyoto Prize in Advanced Technology, the Charles Stark Draper Prize of the U.S. National Academy of Engineering, and the IEEE Medal of Honor, none of these awards have the same prestige or are as well known as the Nobel Prize.  Zachary also believes these awards underscore the abiding stereotype that engineers are solely male.  Only one of the 34 recipients of the Kyoto Prize in Advanced Technology and one of the 47 recipients of the Draper Prize have been women.  Also, of the 95 people who have received the IEEE Medal of Honor, none have been women.

Zachary questions what it takes to become an engineering hero.  He believes that overcoming adversity, whether personal, institutional, or technological, is a valid criterion.  For example, computer scientist Grace Hopper, developer of the first compiler, beat all three.  She succeeded in a male-dominated field and institution while shaping the course of computer programming and reaching the rank of rear admiral in the U.S. Navy.  Contribution to the social and cultural well-being of humanity is another criterion for engineering heroism in Zachary's eyes.  However, throughout engineering history, people have sought to solve technological problems because they were there, not necessarily because they were considering the greater good.  Even so, many of these inventions did result in benefits for humanity.  For example, mechanical engineer Jacob Perkins created the first refrigerator.  While his invention was far from the refrigerators we know today, it is because of his work that countless lives were saved.  Before the refrigerator, foodborne illness and death were common headlines.  If Jacob Perkins isn't an engineering hero, then I don't know who is.

Zachary then continues by tackling the question: Can heroism be taught, or is it innate?  He strongly believes that heroes are made, not born.  They learn from their experiences, react to opportunities and setbacks, and when others stay in the safe zone, they reach into the grey area searching for something more.  By reaching into the grey area, engineering heroes achieve “charismatic authority”, or the ability to influence, inspire, and lead others, a phrase coined by German sociologist Max Weber.  Charismatic authority does not just apply to those who gain outsize status through media acclaim.  Charismatic engineers can also work on an intimate level by influencing their peers behind the scenes or by challenging the norm through their inventions or designs.  “The history of engineering is replete with examples of unheralded engineers who refused to accept designs that compromised the public welfare, no matter how profitable they were,” said historian Matthew Hersch. “Inventions like the safety match and the safety bicycle not only worked better than their predecessors, but more ethically. To me, the creators of these technologies are the real heroes.”

The most accomplished engineers have tried and failed many times in their careers.  While many know who Cerf and Kahn are, most have not heard of Louis Pouzin.  Pouzin, the creator of an early packet-switching network called Cyclades, envisioned the democratizing potential of computer networking.  In 1975, Pouzin and Cerf led a group that attempted to get a packet-switching standard adopted by the International Telegraph and Telephone Consultative Committee.  Pouzin publicly criticized the telecom industry's conservatism and shortly thereafter saw his funding and career opportunities diminish.  Cerf and Kahn incorporated aspects of Pouzin's ideas into the TCP/IP design for the Internet.  Decades later, Pouzin is finally receiving some recognition for his contribution.  None of these engineers worked alone, and their accomplishments occurred in parallel with the efforts of others.

While the engineering community values modesty and suspects that promotion conceals distortion or even fraud, Zachary truly believes that heroes and heroism are essential for engineers to gain respect and acknowledgement for their activities and technological developments.

http://spectrum.ieee.org/geek-life/profiles/where-are-todays-engineering-heroes

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 

Materials Science Engineers Make a More Energy-Efficient Fuel Cell

 

 


 

 

While renewable energy sources help to fight the effects of global warming, they do have their drawbacks.  Renewable energy cannot be produced as predictably as power from plants burning oil, coal, or natural gas.  Ideally, alternative energy plants would be paired with a huge energy storage system that would store and dispense power.  The Stanford School of Engineering is working to use reversible fuel cells to combat this storage issue.  Fuel cells use oxygen and hydrogen to create electricity; if the process were reversed, the fuel cell could also be used to store electricity.

"You can use the electricity from wind or solar to split water into hydrogen and oxygen in a fuel cell operating in reverse," said William Chueh, an assistant professor of materials science and engineering at Stanford and a member of the Stanford Institute of Materials and Energy Sciences at SLAC National Accelerator Laboratory. "The hydrogen can be stored, and used later in the fuel cell to generate electricity at night or when the wind isn't blowing."

Fuel cells are not a perfect solution.  The chemical reactions that cleave water into hydrogen and oxygen or join them together are not completely understood, at least not to the degree necessary to make utility-grade storage systems.  Chueh is working alongside researchers from SLAC, Lawrence Berkeley National Laboratory and Sandia National Laboratories to study the chemical reactions in fuel cells in a new way.  In an article published in Nature Communications, Chueh and his team describe how they observed the hydrogen-oxygen reaction in a specific type of high-efficiency solid-oxide fuel cell.  They also took atomic-scale photos of the process using a particle accelerator called a synchrotron.  This type of analysis is the first of its kind and could help lead to more efficient fuel cells that would eventually allow for utility-scale alternative energy systems.

The Electrons' Role

In a traditional fuel cell, a gas-tight membrane separates the anode and cathode. Oxygen molecules are introduced at the cathode where a catalyst fractures them into negatively charged oxygen ions.  These ions then make their way to the anode where they react with hydrogen molecules to form the cell's primary "waste" product: pure water.  To perform these reactions, electrons also need to make the journey.  Normally, the electrons are drawn to the cathode and the ions are drawn toward the anode, but while the ions pass directly through the membrane, the electrons can't penetrate it; they are forced to circumvent it via a circuit that can be harnessed to run anything from cars to power plants.
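
For reference, the standard textbook half-reactions for an oxygen-ion-conducting solid-oxide cell (not quoted from the Stanford paper) make that two-way traffic of ions and electrons explicit:

    Cathode:  1/2 O2 + 2 e-  ->  O^2-            (oxygen ions enter the electrolyte)
    Anode:    H2 + O^2-      ->  H2O + 2 e-      (electrons return via the external circuit)
    Overall:  H2 + 1/2 O2    ->  H2O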

Because electrons do the designated "work" of fuel cells, they are thought of as the critical functioning component. But ion flow is just as important, said Chueh.

"Electrons and ions constitute a two-way traffic pattern in many electrochemical processes," Chueh said.  "Fuel cells require the simultaneous transfer of both electrons and ions at the catalysts, and both the electron and ion 'arrows' are essential."

Electron transfer in electrochemical processes such as corrosion and electroplating is relatively well understood, Chueh said, but ion flow has remained unclear.  This is because the environment where ion transfer may best be studied, the catalysts in the interior of fuel cells, is not conducive to inquiry.

Solid-oxide fuel cells operate at relatively high temperatures.  Certain materials are known to make superior fuel cell catalysts.  Cerium oxide, or ceria, is particularly efficient.  Cerium oxide fuel cells can hum along at 600 degrees Celsius, while fuel cells incorporating other catalysts must run at 800 C or more for optimal efficiency.  Those 200 degrees represent a huge difference, Chueh said.  "High temperatures are required for fast chemical reactivity," he said.  "But, generally speaking, the higher the temperature, the quicker fuel cell components will degrade.  So it's a major achievement if you can bring operating temperatures down."

How Does It Work

While cerium oxide has established itself as a strong catalyst for fuel cells, it has been unclear why it works so efficiently.  What was needed were visualizations of ions flowing through catalytic materials.  But putting an electron microscope into the pulsing, red-hot heart of a fuel cell running at full bore isn't exactly possible.  "People have been trying to observe these reactions for years," Chueh said.  "Figuring out an effective approach was very difficult."

In their Nature Communications paper, Chueh and his colleagues at Berkeley, Sandia and SLAC split water into hydrogen and oxygen (and vice versa) in a cerium oxide fuel cell.  While the fuel cell was running, they applied high-brilliance X-rays produced by Berkeley Lab's Advanced Light Source to illuminate the routes the oxygen ions took in the catalyst.  Access to the ALS tool and the cooperation of the staff enabled the researchers to create "snapshots" revealing just why ceria is such a superior catalytic material: it is, paradoxically, defective.  "In this context, a 'defective' material is one that has a great many defects -- or, more specifically, missing oxygen atoms -- on an atomic scale," Chueh said. "For a fuel cell catalyst, that's highly desirable."

Such oxygen "vacancies," he said, allow for higher reactivity and quicker ion transport, which in turn translate into an accelerated fuel cell reaction rate and higher power. 

"It turns out that a poor catalytic material is one where the atoms are very densely packed, like billiard balls racked for a game of eight ball," Chueh said. "That tight structure inhibits ion flow. But ions are able to exploit the abundant vacancies in ceria. We can now probe these vacancies; we can determine just how and to what degree they contribute to ion transfer. That has huge implications. When we can track what goes on in catalytic materials at the nanoscale, we can make them better -- and, ultimately, make better fuel cells and even batteries."

 

 

http://www.sciencedaily.com/releases/2014/07/140709095931.htm

 

 

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 

New Class of Electronic Devices Could Come From 2-D Transistors

 

Earlier this spring, two separate research projects described transistors made solely from two-dimensional (2-D) materials.  Argonne National Laboratory researchers described a transparent thin-film transistor (TFT) they had created in the journal Nano Letters.  They used tungsten diselenide (WSe2) as the semiconducting layer, graphene for the electrodes and hexagonal boron nitride as the insulator.  A week later, ACS Nano published that researchers from Lawrence Berkeley National Laboratory had also built an all-2-D transistor, this one a field-effect transistor (FET).  The Berkeley Lab FET used the same materials for its electrode and insulator layers as Argonne's TFT, but used molybdenum disulfide (MoS2) as the semiconducting layer.

While the fabrication of transparent TFTs made from 2-D materials could lead to flexible displays with super-high pixel density, an all-2-D FET could potentially have a broader impact.  FETs are nearly omnipresent, being used in computers, mobile devices, and many other electronic devices.

The issue with FETs prior to Berkeley Lab's work has been that their charge-carrier mobility degrades because of mismatches between the crystal structure and the atomic lattices of the individual components, namely the gate, source and drain electrodes.  These mismatches result in rough surfaces and, in some cases, dangling chemical bonds.  The completely 2-D FET developed at Berkeley Lab eliminates this issue by creating an electronic device in which the interfaces are based on van der Waals interactions, the attractive or repulsive forces between molecules that do not arise from covalent bonds, rather than on covalent bonding.  "In constructing our 2D FETs so that each component is made from layered materials with van der Waals interfaces, we provide a unique device structure in which the thickness of each component is well-defined without any surface roughness, not even at the atomic level," said Ali Javey, a faculty scientist in Berkeley Lab's Materials Sciences Division.  He also said that the approach "represents an important stepping stone towards the realization of a new class of electronic devices."  By having interfaces based on van der Waals interactions instead of covalent bonding, it will be possible to reach a degree of control in materials engineering and device exploration that has yet to be seen.

 

http://spectrum.ieee.org/nanoclast/semiconductors/devices/transistors-made-from-2d-materials

For more information on Glew Engineering Consulting visit the Glew Engineering website, blog or call 800-877-5892 or 650-641-3019. 


Linear v Novellus (Semiconductor Equipment)

  
  

After 8 long years, Novellus finally rid itself of the lawsuit with Linear Technology.  Irell & Manella LLP, for whom Glew Engineering has worked in the past, took no prisoners in the unanimous jury verdict announced yesterday in favor of their client Novellus.  The jury consisted of 12 men and women in Santa Clara, CA, the heart of Silicon Valley.  Certainly good news for Novellus' legal team, as well as their bottom line.  Congratulations to Jonathan Kagan, Esq. and his colleagues.  Now both sides can get back to what they do best: making chips and chip equipment.

Novellus also shipped its 1000th Vector PECVD tool in February.  Considering the tool's throughput and uptime, there may be as many chips out there by now with Novellus' dielectric films as with those of any other semiconductor equipment manufacturer.  See the details at:

http://ir.novellus.com/releasedetail.cfm?ReleaseID=441840

 
