Embedding the World Cup with goal-line technology

For years, FIFA, football’s international governing body, heavily resisted technology’s influence on soccer, almost comically arguing that bad refereeing decisions are all part of the excitement of the game. FIFA president Sepp Blatter described goal-line technology as “only 95 percent accurate”, though even that level of accuracy – compared to a human eye often tens of metres away – is surely a vast improvement?

For embedded technologists, even if that disputable 95 percent figure were to be believed, bridging the remaining 5 percent gap was never going to be a sizeable task. Nevertheless, in 2008, following that statement, the FIFA president put the implementation of such technology on ice – permanently.

Predictably, further controversial decisions ensued, though in relatively low-key matches rather than on the international stage, and in March 2010 a vote was held among eight of soccer’s governing bodies – 6-2 in favor of permanently ditching the technology, the two dissenters being England and Scotland.

In June that year, at the 2010 FIFA World Cup, the tide was about to turn, when hundreds of millions of fans across 241 separate countries saw England’s Frank Lampard score a goal against Germany – the ball clearly over a metre across the line – which was disallowed due to human error by the referee. The disallowed goal came at the turning point of a game England were losing 2-1, and which ended as a 4-1 loss. The outcry – from the embedded computing industry and immense numbers of supporters worldwide – put huge pressure on FIFA, and shortly afterwards Blatter announced that consideration of goal-line technology would be re-opened.

The tech contenders
In 2011 FIFA began internal trials of goal-line technology from 10 companies, and by 2012 had whittled this down to two potential candidates: GoalRef, utilizing a passive “chip-in-ball” and a magnetic field to detect its whereabouts; and Hawk-Eye, utilizing a series of high-resolution cameras and triangulation algorithms.

Both have a very high, though interestingly unpublished, accuracy percentage, but neither could claim 100 percent accuracy as both are fallible to some degree.

The technology based on electromagnetic fields, were it to be used at the 2014 World Cup, would be susceptible to interference – an unscrupulous party could theoretically interfere with its accuracy.

The high-speed-camera-based system, you could argue, is less vulnerable to outside interference, though it is reliant on installation accuracy and calibration, and on the calculations used to derive its decisions having been rigorously proven.
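To make the nature of those calculations concrete, the decision a camera-based system ultimately has to make is geometric: once the ball’s centre has been triangulated from multiple views, has the whole of the ball crossed the plane of the goal line? The sketch below is only an illustration of that final test, not Hawk-Eye’s or GoalControl’s actual algorithm; the coordinate convention, function name, and ball-radius constant are assumptions.

```python
# Minimal sketch of the final geometric test behind a camera-based
# goal-line decision, assuming the ball's centre has already been
# triangulated from multiple camera views. Illustration only; this is
# not Hawk-Eye's or GoalControl's actual algorithm.

BALL_RADIUS_M = 0.11  # a regulation ball is roughly 22 cm in diameter

def is_goal(ball_centre_x_m: float, goal_line_x_m: float = 0.0) -> bool:
    """Return True if the whole ball is past the goal line.

    Coordinates are metres along the axis perpendicular to the goal line,
    with positive x pointing into the goal. The Laws of the Game require
    the whole of the ball to cross the line, so the triangulated centre
    must be at least one ball radius beyond it.
    """
    return ball_centre_x_m - BALL_RADIUS_M > goal_line_x_m

# A centre 15 cm beyond the line is a goal; 5 cm beyond is not,
# because part of the ball still overhangs the line.
print(is_goal(0.15))  # True
print(is_goal(0.05))  # False
```

In practice the hard part is the triangulation and calibration upstream of this simple check, which is exactly why installation accuracy matters so much.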

Additionally, at the 2014 World Cup referees are wearing smartwatches as part of the GoalControl-4D system, which alert them when the goal-line cameras detect a goal.

Neither system can account for the change in shape of a ball when it bounces, for example. The Hawk-Eye system, prior to soccer, had long been employed in snooker (similar to billiards), cricket, and tennis. Bounce distortion isn’t really relevant in soccer, where the question is whether the ball passed a line rather than fell short of it – in tennis, however, it can be contentious: during the 2008 Wimbledon final, a ball that appeared out was called “in” by Hawk-Eye by a single millimeter.

refer to:
http://embedded-computing.com/articles/embedding-world-cup-goal-line-technology/

 

Increasing Data Throughput with M.2

Developed by the PCI-SIG consortium in response to SSDs’ increasing demands for data throughput, M.2, formerly known as Next Generation Form Factor (NGFF), is a new specification for expansion modules in embedded systems with space limitations.

Slimmer and more flexible than the current Mini PCI Express (mPCIe)/Mini-SATA (mSATA) standard, M.2 does not introduce new signaling systems but rather allows for increased data throughput via multi-lane PCI Express (PCIe), and backward compatibility via SATA and USB signals. While driven by the demand for high-speed, high-capacity storage in ultrabooks, tablets and portable devices, M.2’s space-efficient form factor, backward compatibility, and flexibility mean it will have an impact on the embedded sector as well.

The unique needs and requirements of embedded systems make the adoption of M.2 a more complicated decision in this space than on the consumer side, but understanding the background of the technology, its specifications, and benefits can help embedded OEMs and system designers make the right choices now and prepare for the future.

The current generation of small form factor expansion modules for both storage and general peripherals uses a common 30 mm x 50.95 mm mPCIe card form factor. Designed originally for the notebook market as an evolution of Mini PCI, mPCIe is a physical and electrical specification for expansion cards that adds Wi-Fi, Wireless Wide Area Network (WWAN), and other functionality via a miniaturized PCIe connector. mPCIe’s widespread adoption in consumer applications, its small form factor, and its use of the familiar PCIe bus meant it naturally became a convenient and space-efficient way to add functionality to industrial and embedded systems.

As demand for SSDs in notebooks and mobile devices grew, the mSATA format was introduced in 2009 as a small form factor for storage, utilizing the same physical form factor and connector as mPCIe but with a miniaturized SATA interface. While physically similar to mPCIe in both form factor and connector, mSATA cards are electrically different and require mSATA host support to function. Being based on the tried-and-true SATA storage protocol, mSATA made it easy for manufacturers to implement small form factor storage, and it was rapidly adopted in the client space. The same qualities have made mSATA attractive for embedded storage, and today it is one of the most popular small form factor SSD formats in both consumer and industrial markets.

As the client and embedded markets pursue higher-capacity SSDs and higher throughputs to match, the performance bottleneck for top-end SSDs has become the SATA protocol, which is limited to 600 MBps. As SSD capacities increase, speeds go up as well, and even the 600 MBps offered by SATA III is not enough for high-performance applications. At the same time, the mPCIe form factor on which mSATA is based physically limits how much flash can be put on one mSATA card.
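To put that bottleneck in rough numbers: SATA III signals at 6 Gbps with 8b/10b encoding, for about 600 MBps of usable bandwidth, while each PCIe 3.0 lane runs at 8 GT/s with 128b/130b encoding, roughly 985 MBps per lane. The back-of-the-envelope sketch below compares theoretical link rates only, ignoring protocol and controller overhead, so real-world SSD figures will be lower.

```python
# Back-of-the-envelope comparison of theoretical link bandwidth.
# Figures are raw line rates adjusted for encoding overhead only;
# real-world SSD throughput will be lower.

def usable_mbps(gigatransfers_per_s: float, encoding_efficiency: float) -> float:
    """Convert a serial line rate in GT/s to usable MB/s after encoding overhead."""
    return gigatransfers_per_s * 1e9 * encoding_efficiency / 8 / 1e6

links = {
    "SATA III (6 Gbps, 8b/10b)":       usable_mbps(6.0, 8 / 10),
    "PCIe 2.0 x1 (5 GT/s, 8b/10b)":    usable_mbps(5.0, 8 / 10),
    "PCIe 3.0 x1 (8 GT/s, 128b/130b)": usable_mbps(8.0, 128 / 130),
    "PCIe 3.0 x4 (8 GT/s, 128b/130b)": usable_mbps(8.0, 128 / 130) * 4,
}

for name, mbps in links.items():
    print(f"{name}: ~{mbps:.0f} MB/s")

# SATA III tops out near 600 MB/s, while a four-lane PCIe 3.0 M.2 link
# offers roughly 3.9 GB/s of raw link bandwidth.
```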

M.2’s strength as a small form factor lies not just in its potential for the next generation of high-performance SSDs, but also in its backward compatibility. While supporting high-performance storage over multi-lane PCIe, M.2 also supports SATA, USB, and single-lane PCIe. As NVMe awaits adoption in the marketplace, SATA-based first-generation M.2 storage cards and M.2 peripheral cards can allow space-constrained systems to benefit from the smaller and more flexible form factor with the reliability and compatibility of SATA.

For general embedded system applications, mSATA and mPCIe are not going anywhere soon. Industrial applications have modest performance needs, emphasizing reliability and consistency instead. Even for performance-driven systems, the near-term value proposition is tenuous, as the full performance benefits of M.2 SSDs require either NVMe support or proprietary drivers to realize native PCIe speeds. It will take time for the storage ecosystem to support NVMe before embedded applications can enjoy this level of performance, so current-generation M.2 SSDs may be a hard sell over mSATA modules in the embedded space. Meanwhile, mPCIe currently offers more than enough bandwidth for general embedded peripherals such as graphics cards or Wi-Fi modules.
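As a small practical illustration of that compatibility point, a SATA-based M.2 SSD enumerates like any other SATA disk, whereas a PCIe/NVMe card shows up as an NVMe block device. The Linux-only sketch below classifies block devices by the kernel’s naming convention; it is a heuristic for illustration, not a complete detection method, and the function name is an assumption of mine.

```python
# Rough Linux-only sketch: classify block devices by kernel naming
# convention. A SATA-based M.2 SSD shows up like any other SATA disk
# (sdX, via the AHCI driver), while a PCIe/NVMe M.2 SSD appears as
# nvmeXnY. Heuristic illustration only.

from pathlib import Path

def classify_block_devices() -> dict:
    devices = {}
    for dev in Path("/sys/block").iterdir():
        name = dev.name
        if name.startswith("nvme"):
            devices[name] = "NVMe (PCIe) SSD"
        elif name.startswith("sd"):
            devices[name] = "SATA/SAS/USB disk (AHCI or similar)"
        else:
            devices[name] = "other (mmc, loop, ...)"
    return devices

if __name__ == "__main__":
    for name, kind in sorted(classify_block_devices().items()):
        print(f"{name}: {kind}")
```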

refer to:
http://embedded-computing.com/articles/increasing-data-throughput/

Opening Doors to Automation

At the ultra-clean and newly expanded MINOR’S food processing plant in Cleveland, a forklift picks up a bin of product and carries it into the next room along the line, entering through an airlock to minimize the entry of pathogens into the packaging area. But unlike at most facilities, the forklifts here never take a break other than for a battery charge, because there is no one sitting in the driver’s seat.

Nor is there a driver activating door operation. The signal to open and close is generated by the same process management system directing forklift travel.
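The article doesn’t describe the interface between the two, but conceptually the controller that routes the forklift also issues an open command as the vehicle approaches a doorway and a close command once it has cleared it. The sketch below is hypothetical: the DoorController class, thresholds, and distances are invented for illustration, and a real installation would use PLC I/O or a fieldbus rather than direct method calls.

```python
# Hypothetical sketch of a process management system signalling a
# high-speed door as an automated forklift approaches and clears it.
# Device names, distances and the DoorController interface are all
# invented for illustration.

class DoorController:
    def __init__(self, door_id: str):
        self.door_id = door_id
        self.is_open = False

    def open(self):
        self.is_open = True
        print(f"[{self.door_id}] opening")

    def close(self):
        self.is_open = False
        print(f"[{self.door_id}] closing")

def on_vehicle_position(door: DoorController, distance_to_door_m: float,
                        approach_threshold_m: float = 5.0,
                        cleared_threshold_m: float = -2.0):
    """Open the door when the vehicle is in range; close once it has passed.

    A negative distance means the vehicle is past the doorway.
    """
    if cleared_threshold_m < distance_to_door_m <= approach_threshold_m:
        if not door.is_open:
            door.open()
    elif distance_to_door_m <= cleared_threshold_m and door.is_open:
        door.close()

# Simulated forklift path: approaching, passing through, then clear.
airlock_door = DoorController("airlock-1")
for d in (12.0, 6.0, 4.0, 1.0, -1.0, -3.0):
    on_vehicle_position(airlock_door, d)
```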

MINOR’S has joined the growing ranks of companies putting automated material handling (AMH) vehicles to work, seeking increases in productivity and lower operating costs. A recent article in Fast Company on driverless vehicles reveals a driverless car in development that has already logged 500,000 miles. So it’s no surprise that, in the more controllable world of the manufacturing plant, and with industry’s growing need for efficiency, speed and reliability, material handling vehicles will be acquiring minds of their own.

The recently released Material Handling and Logistics US Roadmap, compiled by the national supply chain publications and associations, looks at the industry ten years into the future. Among the ten megatrends unfolding in the next decade, the report predicts that “autonomous control and distributed intelligence” could one day extend to driverless equipment in the warehouse and over the road.

One engine maker envisions unmanned cargo ships, though many in the industry don’t think they will be sailing any time soon. Nevertheless, these technological changes will be driven by a changing marketplace, the growth of e-commerce, mass personalization and, of course, never-ceasing competition – all of which have an impact on the factory and the warehouse.

Industry automation isn’t waiting for 2025. A report published by the Priority Metrics Group detailed that AMH vehicle sales exceeded $15.5 billion worldwide in 2011, up 18% over the previous year. This represents roughly 15% of the investment in new equipment. However, these vehicles cannot wait for the doors within the plant to get out of the way.

Within these plants are walls sectioning off rooms; and like the walls, the doors are supposed to preserve the integrity of the processes or the inventories in a room while allowing traffic to pass in and out. Just about every room maintains its own microclimate: temperature, humidity and air flow are controlled for whatever process takes place or for the product handled within it.

Doors ensure that these areas maintain those conditions, protecting the room from pressure differentials, extreme temperatures, sparks, fumes, drafts, noise or other conditions in the adjoining room that could adversely affect work in process, employee productivity and building energy costs. But if the doors can’t get out of the way in time, progress goes nowhere.

To keep pace with automated vehicles that demand this speed, the doors along the material path must be able to do the following:

Open and Close Rapidly – The lumbering panel door is a thing of the past. For any door to be a member of today’s material handling team, it must be an overhead roll-up style that gets out of the way of vehicles and attains the high speeds necessary for efficient product flow. These roll-up doors also take up minimal wall space, freeing those areas for shelving or machinery.

These doors are now capable of speeds of 60 inches per second and faster, and a typical eight-foot-high door can be fully opened in under two seconds. The rapid roll-up door minimizes room exposure, giving practically no time for energy to escape or contaminants to invade.
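Those two figures are consistent with each other: at 60 inches per second, a 96-inch (eight-foot) opening takes about 1.6 seconds to clear, comfortably under two seconds. A quick check, assuming the door must travel the full opening height:

```python
# Quick sanity check of the quoted door speed: an eight-foot (96 in)
# opening at 60 inches per second clears in about 1.6 seconds,
# comfortably under the "under two seconds" figure.

door_height_in = 8 * 12        # eight-foot door opening, in inches
door_speed_in_per_s = 60       # quoted roll-up speed

open_time_s = door_height_in / door_speed_in_per_s
print(f"Full open in ~{open_time_s:.1f} s")  # ~1.6 s
```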

At MINOR’S ultra-pure food processing facility, specially designed automated forklifts carry product from one room to another. The concern of process engineers at this operation is to minimize contaminants throughout the processing chain. To maintain product quality, entrance and exit are through an airlock.

refer to:
http://www.automation.com/automation-news/todays-featured-news-headlines/opening-doors-to-automation