Maritime Shipping Industry Ripe for AI Disruption


The maritime shipping industry is ripe for disruption by AI, with startups positioning to help established shippers exploit the potential. (GETTY IMAGES)

By AI Trends Staff

The maritime shipping industry is ripe for disruption by AI, with startups positioning to help established shippers exploit the potential.

The industry naturally produces huge amounts of data, and opportunities exist at every step of the supply chain for stakeholders to use AI to augment their operation with positive effects. The convergence of AI and the Internet of Things (IoT) also offers the potential of a more connected intelligence.

This will include predictive analytics (what will happen), prescriptive analytics (what should we do) and adaptive analytics (how should the system adapt to the latest changes), according to an account from Pacific Green Technologies Group. Major maritime players including Kongsberg, Rolls Royce, Maersk and Wartsila all recognize that the changes AI brings to the field are poised to accelerate dramatically.

Startup Sea Machines of Boston is currently testing its perception and situational awareness technology aboard one of Maersk’s newest Winter Palace ice-class container ships. Several other installations are scheduled.

While the technology does not make the ship autonomous, it is a step toward self-steering vessels. Similar to the systems behind self-driving car features, it collects streams of data from the vessel’s surroundings, identifies and tracks potential conflicts, and displays that knowledge in the wheelhouse.
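Sea Machines has not published its algorithms, but the core task its system performs, flagging tracked targets that pose a collision risk, can be illustrated generically. The Python sketch below computes the distance and time of the closest point of approach (CPA) for a tracked vessel relative to own ship and raises a flag when both fall under assumed thresholds; the data structures, threshold values and example tracks are illustrative assumptions, not Sea Machines’ design.

from dataclasses import dataclass
import math

@dataclass
class Track:
    """Position (metres, local grid) and velocity (m/s) of a vessel."""
    x: float
    y: float
    vx: float
    vy: float

def cpa_tcpa(own: Track, target: Track):
    """Return (distance at closest point of approach in metres, time to it in seconds)."""
    rx, ry = target.x - own.x, target.y - own.y          # relative position
    rvx, rvy = target.vx - own.vx, target.vy - own.vy    # relative velocity
    rv2 = rvx * rvx + rvy * rvy
    if rv2 < 1e-9:                                       # same course and speed
        return math.hypot(rx, ry), 0.0
    tcpa = max(-(rx * rvx + ry * rvy) / rv2, 0.0)        # only future encounters matter
    dcpa = math.hypot(rx + rvx * tcpa, ry + rvy * tcpa)
    return dcpa, tcpa

def is_conflict(own: Track, target: Track,
                min_dcpa_m: float = 500.0, max_tcpa_s: float = 900.0) -> bool:
    """Flag a target whose predicted pass is both close and soon (assumed thresholds)."""
    dcpa, tcpa = cpa_tcpa(own, target)
    return dcpa < min_dcpa_m and tcpa < max_tcpa_s

# Example: own ship heading north at roughly 14 knots; a target two kilometres off,
# converging on a near-collision course.
own_ship = Track(x=0.0, y=0.0, vx=0.0, vy=7.0)
crossing = Track(x=2000.0, y=1000.0, vx=-6.0, vy=4.0)
print(is_conflict(own_ship, crossing))   # True

A real system would fuse radar, AIS and camera detections into such tracks and apply far richer rules, including COLREGs logic and uncertainty handling, before presenting anything in the wheelhouse.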

Sea Machines’ team includes experienced managers from marine construction, salvage, and offshore oil and gas, along with automation engineers and autonomy scientists. The company has raised $12.3 million in funding so far, according to Crunchbase.

Orca AI of Tel Aviv offers a collision avoidance system used in marine navigation. The system applies AI to data provided by vision and other sensors. The company is committed to reducing human errors in maritime shipping through use of intelligent, automated vessels. The system helps the captain and navigation crew get an accurate view of the environment in real time, thus assisting in making decisions.

Major shipping companies are also involved in developing their own AI systems for navigation. Wartsila Guidance Marine, a unit of the marine technology company Wartsila, in 2018 launched SceneScan, a system that uses laser position reference sensors to guide navigation in harbors. Tracking information is provided relative to structures within the sensor field of view. The system matches its current observation of the scene against a map generated from previous observations of the scene.
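Wartsila has not published SceneScan’s internals, but what the article describes, matching a current laser observation against a map built from previous observations, is the classic scan-matching problem. The deliberately minimal Python sketch below estimates only a 2-D translation by repeated nearest-neighbour matching; a production system would also estimate rotation, iterate to convergence and filter the output, so treat this as an illustration of the idea rather than the actual algorithm.

import numpy as np

def align_translation(scan: np.ndarray, map_pts: np.ndarray, iterations: int = 10) -> np.ndarray:
    """Estimate the 2-D offset that best aligns `scan` (N x 2 points from the current
    observation) with `map_pts` (M x 2 points from previous scans), using repeated
    nearest-neighbour matching (a stripped-down, translation-only ICP)."""
    offset = np.zeros(2)
    for _ in range(iterations):
        shifted = scan + offset
        # For every scan point, find the closest map point (brute force for clarity).
        d2 = ((shifted[:, None, :] - map_pts[None, :, :]) ** 2).sum(axis=2)
        nearest = map_pts[d2.argmin(axis=1)]
        # Shift by the mean residual between matched pairs.
        offset += (nearest - shifted).mean(axis=0)
    return offset

# Toy example: the stored map is the corner of a jetty; the new scan is the same
# corner seen from a vessel that has drifted 2 m east and 1 m north.
map_points = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [2.0, 1.0], [2.0, 2.0]])
current_scan = map_points - np.array([2.0, 1.0])
print(align_translation(current_scan, map_points))   # approximately [2. 1.]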

Wartsila Guidance Marine successfully completed sea trials of the SceneScan system in April 2019 aboard the Topaz Citadel, a vessel owned by Topaz Energy and Marine, a leading international offshore support vessel company.

Disruption Likely to Include Job Loss in Marine Industry

Longer term, autonomous shipping will disrupt the maritime industry, including through significant job losses, suggests a recent report in Sea News.

“Autonomous shipping is the future of the maritime industry. As disruptive as the smartphone, the smart ship will revolutionize the landscape of ship design and operations,” stated Mikael Mäkinen, President, Marine at Rolls-Royce Plc.

Mikael Mäkinen, President, Marine at Rolls-Royce Plc.

The timeline for delivery on the promise of autonomous ships is stretched out. Estimates are that the first remote-controlled, unmanned coastal vessels will not be launched until 2025. Fully-autonomous unmanned coastal vessels are not expected until 2035, according to a report by Nautix of Copenhagen, a company offering marine fleet management software.

The three founders of Nautix started their careers at the Singapore Maritime Academy in 2003. In the ensuing years, they gained experience in the maritime industry working as deck officers, engineers, superintendents and software innovation managers.

“We’ve felt the pain of our colleagues being let down by the sub-standard tools they’ve been provided. We want to change the status quo. We have the software expertise and the technical knowledge to make a difference,” states Tarang Valecha, co-founder and CEO of Nautix on the company’s website.

Tarang Valecha, co-founder and CEO of Nautix

Serious challenges remain, and not just technical ones. International guidelines and regulations for autonomous ships are unlikely to be agreed upon within the next decade. The International Transport Workers’ Federation (ITF) has suggested that remotely controlled vessels will lack the skills, knowledge and experience of professional seafarers, so in the event of an accident or incident requiring immediate attention, the autonomous vessel could be at risk.

The ITF and the International Federation of Shipmasters’ Associations (IFSMA) are very concerned about job loss. Today the industry employs an estimated 1.6 million people on ships and land, who carry out 90 percent of world trade. More than 80 percent of seafarers surveyed by these two organizations have anxiety about possible job losses with the advent of AI and automation.

A study from Oxford University estimated that 47% of US jobs in the maritime industry, both low-skilled and high-skilled, could be lost over the next 20 years.

Read the source articles and studies from Pacific Green Technologies Group and in Sea News.

UPS Committed to Electric Vehicles on an AI Platform


UPS has made a major commitment to electric delivery vehicles incorporating AI, including these bikes in Paris. (UPS PRESS SERVICE)

By AI Trends Staff

UPS, the logistics and delivery company, is spearheading a project in England to explore how AI systems can optimize the charging of electric fleet vehicles, and help integrate onsite renewable energy resources at vehicle depots.

The EV Fleet-Centered Local Energy Systems (EFLES) project is scheduled to start in May at the UPS depot in the Camden borough of London, according to a recent account in electrive. UK Power Networks Services will provide oversight, while the smart battery and EV-charging software provider Moixa will contribute its GridShare smart AI platform to manage solar, storage and charging assets.

“We have the global expertise, smart-charging infrastructure and resources to host this first-of-a-kind test bed at our Camden facility,” stated UPS sustainable development coordinator Claire Thompson-Sage. “This project will build on our EV infrastructure technology to help develop a holistic local energy system.”

Claire Thompson-Sage, UPS Sustainable Development Coordinator

The GridShare software helps track hundreds of data sources for energy prices, power demand, weather conditions and more, to help determine which charging times are less expensive and which mix of renewable energy makes the most sense at any given point in time.
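Moixa has not disclosed how GridShare weighs these inputs, but the underlying scheduling task can be shown with a toy example: given a forecast of hourly electricity prices and the energy a van must have by morning, fill the cheapest hours first without exceeding the charger’s power rating. The prices, vehicle figures and greedy strategy below are assumptions for illustration, not the GridShare algorithm.

def plan_charging(hourly_prices, energy_needed_kwh, charger_kw):
    """Greedy overnight plan: charge in the cheapest forecast hours until the
    vehicle's energy requirement is met. Returns {hour of day: kWh to deliver}."""
    plan = {}
    remaining = energy_needed_kwh
    for hour, price in sorted(hourly_prices.items(), key=lambda kv: kv[1]):
        if remaining <= 0:
            break
        delivered = min(charger_kw, remaining)   # at most one hour at full charger power
        plan[hour] = delivered
        remaining -= delivered
    return plan

# Assumed overnight price forecast in pence per kWh (hour of day -> price).
prices = {20: 18.0, 21: 16.5, 22: 14.0, 23: 11.0, 0: 9.5, 1: 9.0, 2: 9.2,
          3: 10.5, 4: 12.0, 5: 15.0}
# A van assumed to need 45 kWh by morning on a 7.4 kW depot charger.
print(plan_charging(prices, energy_needed_kwh=45.0, charger_kw=7.4))

A real optimizer would also fold in depot demand limits, solar and battery forecasts, and departure times, which is where the AI-driven forecasting earns its keep.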

The EFLES project is the next stage in the UPS partnership with Arrival, the UK-based “generation 2” electric vehicle manufacturer, which developed its newest vehicle with UPS.

UPS recently placed an order for 10,000 electric vehicles from Arrival, to be delivered from 2020 to 2024, according to a recent press release issued by Arrival.

UPS co-developed the Generation 2 vehicles with Arrival, which employed a new method of assembly using low capital, low-footprint micro-factories located to serve local communities and be profitable making thousands of units. The UPS partnership with Arrival was first announced in 2016.

“UPS has been a strong strategic partner of Arrival, providing valuable insight into how electric delivery vans are used on the road and how they can be optimized for drivers,” stated Denis Sverdlov, founder and CEO of Arrival. “Together our teams have been creating bespoke [custom] electric vehicles, based on our flexible skateboard platforms, that meet the end-to-end needs of UPS from driving, loading/unloading, depot and back office operations.”

Denis Sverdlov, founder and CEO of Arrival

Carlton Rose, President of UPS Global Fleet Maintenance & Engineering, stated, “Our investment and partnership with Arrival is directly aligned with UPS’s transformation strategy, led by the deployment of cutting-edge technologies. These vehicles will be among the world’s most advanced package delivery vehicles, redefining industry standards for electric, connected and intelligent vehicle solutions.”

UPS Has Had Long Commitment to Electric Vehicles

UPS had 1,000 electric vehicles in its fleet of 112,000 vehicles two years ago. New electric vehicles were found to cost no more than comparable diesel vehicles, because the cost of electric batteries had plummeted 80 percent in six years, according to a UPS press release from April 2018. The electric transporters are expected to create additional value for UPS in operational savings and routing efficiency.

In the U.S., UPS has been working with Workhorse to develop an electric transport vehicle. The target at the outset was a range of 100 miles, with procurement costs similar to those of an internal combustion vehicle. The founder and CEO of Workhorse, Steve Burns, late last year bought the Lordstown, Ohio electric vehicle plant from General Motors, through a company he set up to execute the transaction, Lordstown Motors. He has said he wants to build electric pickup trucks for “business and government customers” and has named the first model the Endurance.

Financially, Workhorse has faced some challenges, losing $38 million in 2019 and recording few sales late in the year, according to a recent account in The Verge. Workhorse will own 10 percent of Lordstown Motors, and license to it the intellectual property related to the planned W-15 electric pickup truck. Burns will transfer 6,000 pre-orders for the truck to Lordstown. He is searching for financing, saying he needs $300 million to start production within a year. He plans to run a union shop and produce 500,000 vehicles per year, double the number of Cruze sedans GM made at the plant.

In the case of these new trucks, UPS worked closely with a supplier, Workhorse, to redesign the trucks “from the ground up,” stated Scott Phillippi, UPS’s senior director of maintenance and engineering. Phillippi expects the new design will reduce the truck’s weight by some 1,000 pounds, compared with a diesel or gas-powered vehicle. That plus better batteries will give the truck an electric range of around 100 miles, enough for most routes in and around cities.

Read the source articles and releases at electrive, the press release from Arrival on the 10,000-vehicle order, the account in The Verge, and the UPS press release from April 2018.

Autonomous Freight Trains Powered by AI Coming

By AI Trends Staff
Driverless trains powered by AI are coming. Driverless train software produced by New York Air Brake was used in a demonstration last summer of a 30-car freight train traveling 48 miles at a research and testing facility owned by the Association of American Railroads, according to a …

AI Put to Work to Help US Steel Industry Stay Competitive


US steel manufacturers are implementing enterprise AI to help gain efficiencies and reduce downtime. (GETTY IMAGES)

By AI Trends Staff

As the US steel industry looks for ways to lower costs in a global market facing slowing demand, a modern steel plant in Arkansas is using AI to help it become more competitive.

The Big River Steel Mill, which began operating in January 2017, melts scrap metal and produces steel for more than 200 customers, including four automakers, according to a recent account in WSJPro.

The plant’s AI system has been designed by Noodle Analytics of San Francisco, which uses deep learning and neural networks to continually train algorithms on data captured by thousands of sensors.

“We’re using the best available technology and pressing that technology farther, we think, than anyone in the steel industry,” stated Big River Chief Executive David Stickler, a veteran of the steel, mining and recycling industries. “Any future steel facilities that are built will try to capitalize on what we’ve done and replicate it.”

An environment of falling steel prices and a decline in demand from manufacturers is creating an opportunity for newer plants with lower operating costs. The hope for the AI at Big River is that it will lower operating costs and help to sell unused power when demand for electricity is high.

David Stickler, Chief Executive, Big River Steel Mill

One expert credited Big River for being at the cutting edge of steel mill technology. It is the world’s first steel plant designed to manage its operations with the aid of “artificial intelligence from the drawing board,” stated Ron Ashburn, executive director of the Association for Iron & Steel Technology.

Big River started the AI project in 2017, collecting and analyzing data and training algorithms used to predict maintenance requirements for new machinery. The system collects data on equipment conditions, assessing wear and tear in the hopes of reducing shutdown time and gaining operating hours.
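Neither Big River nor Noodle Analytics has published model details, but the pattern the article describes, learning from sensor histories to anticipate failures before they force a shutdown, can be sketched generically. The example below trains a toy failure-risk classifier on synthetic vibration and temperature readings with scikit-learn; the features, labels, risk formula and alert threshold are all hypothetical.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic sensor history: one row per machine-hour of vibration (mm/s) and bearing
# temperature (deg C); label 1 means a failure occurred within the following 24 hours.
n = 5000
vibration = rng.normal(3.0, 1.0, n)
temperature = rng.normal(70.0, 8.0, n)
risk = 1 / (1 + np.exp(-(0.9 * (vibration - 4.0) + 0.08 * (temperature - 80.0))))
failed_soon = rng.random(n) < risk

X = np.column_stack([vibration, temperature])
X_train, X_test, y_train, y_test = train_test_split(X, failed_soon, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("held-out accuracy:", round(model.score(X_test, y_test), 3))

# Flag machines whose predicted failure probability exceeds an assumed threshold.
latest_readings = np.array([[5.8, 88.0], [2.4, 66.0]])
print("maintenance flags:", model.predict_proba(latest_readings)[:, 1] > 0.5)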

Noodle.AI is also working with SSAB Americas, a global steel manufacturer, to pair its Enterprise AI Platform with SSAB’s sensor data and external data to help plan business operations, according to an account in Robotics Business Review. The plan is to improve machinery uptime, engage in predictive maintenance and seek ways to optimize the plant.

“We are excited to implement new digitalization technologies and to explore how the application of Enterprise AI can impact our performance and create a competitive advantage,” stated Tom Toner, Vice President of Operations for SSAB Americas. “Our goal is to learn how we can increase efficiency and decrease any bottlenecks in our operations with this advanced technology.”

Noodle.ai’s founder and CEO Steve Pratt stated, “SSAB Americas is a pioneering manufacturing company that is looking to embrace new technologies to improve the quality of their products, service to customers and competitiveness.”

The steel industry has been disrupted in the past two decades by steel plant capacity added in China, which now produces 50% of the world’s steel. As the Chinese began to export excess inventory at lower prices, western producers came under pressure. As a result, steel manufacturers in the west are concentrating on improving efficiency by modernizing, according to an account written by Hiranmay Sarkar, a managing partner with Tata Consultancy Services, in SupplyChainBrain.

The steelmaking labor force has been reduced in favor of automation over the last 25 years, a period in which world steel production grew two and a half times while the industry cut its workforce by more than 1.5 million workers, Sarkar reported.

A digital twin is a digital replica of a physical asset, including its systems and devices. The twin can serve as the backbone for cyber-physical integration, enabling seamless transition of data between digital and physical worlds. To enable enterprise AI, Sarkar suggests, the digital twin needs to have these attributes:

  • An ecosystem commerce platform, off-the-shelf software, for information exchange with internal and external business partners;
  • Physical equipment connectivity and event capture, through IoT devices. This ensures real-time data collection at various nodes of the supply chain, such as ore storage by miners, suppliers and vessel operators, production by coke oven, blast furnace and mill, and product storage and distribution by yards and freight transporters (a minimal sketch of such event capture follows this list).
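As a rough illustration of that second attribute, the sketch below shows a digital twin ingesting timestamped IoT events from supply-chain nodes and keeping a live view of each node’s state. The node names, event shape and in-memory storage are illustrative assumptions, not a description of any vendor’s platform.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SensorEvent:
    """One timestamped reading from an IoT device at a supply-chain node."""
    node: str          # e.g. "ore_storage", "blast_furnace", "freight_yard"
    metric: str        # e.g. "stock_tonnes", "temperature_c"
    value: float
    ts: datetime

@dataclass
class DigitalTwin:
    """In-memory twin: latest known state of every physical node, plus event history."""
    state: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def ingest(self, event: SensorEvent) -> None:
        self.history.append(event)
        self.state.setdefault(event.node, {})[event.metric] = event.value

    def snapshot(self, node: str) -> dict:
        """Current view of one node, as a planning or analytics layer would query it."""
        return dict(self.state.get(node, {}))

twin = DigitalTwin()
now = datetime.now(timezone.utc)
twin.ingest(SensorEvent("ore_storage", "stock_tonnes", 12500.0, now))
twin.ingest(SensorEvent("blast_furnace", "temperature_c", 1480.0, now))
print(twin.snapshot("ore_storage"))   # {'stock_tonnes': 12500.0}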

Read the source articles in WSJPro, Robotics Business Review and SupplyChainBrain.

deepsense.ai, Volkswagen Train Real World Autonomous Vehicle Entirely In Simulation


Volkswagen worked with AI experts at deepsense.ai to create a synthetic dataset to train autonomous cars in how to drive. (GETTY IMAGES)

By AI Trends Staff

deepsense.ai and Volkswagen have published research on arXiv showing that an autonomous vehicle trained entirely in simulation can drive in the real world. The team trained its policies using a reinforcement learning algorithm with “mostly synthetic data”, then transferred its neural network into a real car, providing it with over 100 years’ driving experience before the real-world engine had ever started.

“Moving the neural network policy from the simulator to reality was a breakthrough and the heart of the experiment,” explained Piotr Miłoś, deepsense.ai researcher and a professor at the Polish Academy of Sciences in a statement about the work. “We have run multiple training sessions in the simulated environment, but even the most sophisticated simulator delivers different experiences than what a car encounters in the real world. This is known as the sim-to-real gap. Our algorithm has learned to bridge this gap. That we actually rode in a car controlled by a neural network proves that reinforcement learning-powered training is a promising direction for autonomous vehicle research overall.”

The experiment was conducted by a team of researchers including deepsense.ai’s Błażej Osiński, Adam Jakubowski, Piotr Miłoś, Paweł Zięcina and Christopher Galias, Volkswagen researcher Silviu Homoceanu and University of Warsaw professor Henryk Michalewski. The team not only trained a model that controls a car in a simulated environment but also executed test drives in a real car at Volkswagen’s testing facility. The car navigated through real streets and crossroads, performing real-world driving maneuvers.

The team used the CARLA simulator, an open-source simulator for autonomous driving research based on Unreal Engine 4, and tested 10 models with different driving and input variables.

Simulations Offer Advantages

Being able to use simulation offers crucial advantages, the team says. First, it is cheaper. In their combined experiments, the team generated some 100 years of simulated driving experience. Racking up so much experience in a real car isn’t feasible. Training in a simulator can also be done much faster. The agent can experience all manner of danger, from simple rain to deadly extreme weather, accidents to full-blown crashes, and learn to navigate or avoid them. Subjecting an actual driver to such hazards would be prohibitively complicated, time-consuming, and ethically unacceptable. What’s more, extreme scenarios in the real world are relatively uncommon but can be quickly simulated.

“It was exciting to see that our novel approach worked so well. By further exploring this path we can deliver more reliable and flexible models to control autonomous vehicles,” said Blazej Osinski, a data scientist at deepsense.ai. “We tested our technique on cars, but it can be further explored for other applications.”

Błażej Osiński, Data Scientist, deepsense.ai

Reinforcement learning allows a model to shape its own behavior by interacting with the environment and receiving rewards and penalties as it goes. The model’s goal is, ultimately, to maximize the rewards it receives while avoiding penalties. An autonomous car receives points for safe driving and complying with the traffic laws. Simulating every road condition that can occur is impossible but shaping a set of guidelines like “avoid hitting objects” or “protect passenger from any and all harm” is not. Ensuring the model sticks to the guidelines makes it more reliable, including in less common situations it will eventually face on the road.

“Using reinforcement learning has reduced the amount of human engineering work,” explained Osinski. “We didn’t have to create driving heuristics or collect reference drives; instead we only had to define desired outcomes (making progress on a route) and undesired behaviors (crashes, deviating from your lane, etc.). Based on these rewards and punishments, RL is able to figure out the rest.”
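The team’s paper defines its own reward terms, but the shape Osinski describes can be illustrated with a generic sketch: reward progress along the route each step, charge for drifting out of the lane, and end the episode with a large penalty on a collision. The weights and inputs below are arbitrary illustrations, not the values used in the deepsense.ai and Volkswagen experiments.

def step_reward(progress_m, lane_offset_m, collided,
                progress_weight=1.0, offset_weight=0.5, crash_penalty=100.0):
    """Toy driving reward: pay for metres of route progress this step, charge for
    distance from the lane centre, and heavily penalize a collision."""
    if collided:
        return -crash_penalty          # the episode would also terminate here
    return progress_weight * progress_m - offset_weight * abs(lane_offset_m)

# A clean step making 2.5 m of progress while 0.3 m off-centre, versus a crash.
print(step_reward(progress_m=2.5, lane_offset_m=0.3, collided=False))   # 2.35
print(step_reward(progress_m=2.5, lane_offset_m=0.3, collided=True))    # -100.0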

There are many next steps for the team, Osinski said. “We can definitely increase the robustness of our system against more conditions and higher driving speeds. We would also like to tackle situations requiring ‘assertive’ interactions with other drivers, such as changing lanes in heavy traffic.”

Technical details of the research are described in the team’s paper, available on arXiv.

Source: AI Trends

AI Initiatives in Manufacturing Often Loosely Defined, Survey Finds


Manufacturers are pursuing AI in a measured approach, gaining experience and dealing with challenges. (GETTY IMAGES)

By AI Trends Staff

Many AI initiatives are loosely defined, lack proper technology and data infrastructure, and are often failing to meet expectations, according to a new report from Plutoshift on implementation of AI by manufacturing companies.

A supplier of an AI solution for performance monitoring, Plutoshift surveyed 250 manufacturing professionals with visibility into their company’s AI programs. Overall, the survey found that manufacturing companies are gaining experience while taking a measured approach to implementing AI.

Among the specific findings:

  • 61% said their company has good intentions but needs to reevaluate the way it implements AI projects;
  • 17% said their company was in the full implementation stage of its AI project;
  • 84% are not yet able to automatically and continuously act on their data intelligence, while some are still gathering data;
  • 72% said it took more time than anticipated for their company to implement the technical/data collection infrastructure needed to take advantage of AI;
  • Only 57% said their company implemented AI projects with a clear goal, while almost 20% implemented AI initiatives due to industry or peer pressure to utilize the technology;
  • 17% said their company implemented AI projects because the company felt pressure from the industry to utilize this technology;
  • 60% said their company struggled to come to a consensus on a focused, practical strategy for implementing AI.

Among its conclusions, the report stated, “To truly utilize data, manufacturing companies need a data infrastructure and platform that is designed around performance monitoring for the physical world. That means gaining the ability to take data from any point in the workflow, analyze that data, and provide reliable predictions at any point. Right now, few companies report these full capabilities and would rethink their direction.”

Plutoshift CEO and Founder Prateek Joshi stated in a  press release about the survey, “Companies are forging ahead with the adoption of AI at an enterprise level. Despite the progress, the reality that’s often underreported is that AI initiatives are loosely defined. Companies in the middle of this transformation usually lack the proper technology and data infrastructure. In the end, these implementations can fail to meet expectations. The insights in this report show us that companies would strongly benefit by taking a more measured and grounded approach toward implementing AI.”

Prateek Joshi, CEO and Founder, Plutoshift

Biggest Players Investing and Gaining Valuable Experience with AI

Another way to gauge how AI is or will penetrate manufacturing is to examine what the biggest players are doing. Siemens, GE, FANUC, and KUKA are all making significant investments in machine learning-powered approaches to improve manufacturing, described in a recent account in emerj. They are using AI to bring down labor costs, reduce product defects, shorten unplanned downtimes and increase production speed.

These giants are using the tools they are developing in their own manufacturing processes, making them the developer, test case, and first customers for many advances.

The German conglomerate Siemens has been using neural networks to monitor its steel plants and improve efficiencies for decades. The company claims to have invested around $10 billion in US software companies (via acquisitions) over the past decade. In March of 2016, Siemens launched Mindsphere, described as an “IoT operating system” and a competitor to GE’s Predix product. Siemens describes Mindsphere as a smart cloud for industry, able to monitor machine fleets throughout the world. In 2016, it integrated IBM Watson Analytics into its tools service.

Siemens describes an AI success story with its effort to improve gas turbine emissions. “After experts had done their best to optimize the turbine’s nitrogen oxide emissions,” stated Dr. Norbert Gaus, Head of Research in Digitalization and Automation at Siemens Corporate Technology, “our AI system was able to reduce emissions by an additional ten to fifteen percent.”

Siemens envisions incorporating its AI expertise within Click2Make, its production-as-a-service technology, described in a 2017 account in Fast Company as a “self-configuring factory.” Siemens envisions a market where companies submit designs, and factories with the facilities and time to handle the order would start an automatic bidding process. The winning manufacturer would respond with the factory configuring itself. That’s the idea.

Dr. Norbert Gaus, Head of Research in Digitalization and Automation, Siemens Corporate Technology

GE’s Manufacturing Software Strategy a Work in Progress

GE, which has had fits and starts with its software strategy, has over 500 factories worldwide that it is transforming into smart facilities. GE launched its Brilliant Manufacturing Suite for customers in 2015. The first “Brilliant Factory” was built that year in Pune, India, with a $200 million investment. GE claims it improved equipment effectiveness by 18%.

Last year, GE sold off most of the assets of its Predix unit. An account in Medium described reasons for the retrenchment, including the decision to build a Predix cloud data center while failing to recognize the competition from Amazon, Microsoft, and Google. Another criticism was that Predix was not known to be developer-friendly. Successful platforms need developer content, and developers need support from a community.

GE’s software strategy in manufacturing is a work in progress.

FANUC Has Invested in AI

FANUC, the Japanese company producing industrial robotics, has made substantial investments in AI. In 2015, FANUC acquired a stake in the AI startup Preferred Networks, to integrate deep learning into its robots.

In early 2016, FANUC announced a collaboration with Cisco and Rockwell Automation to develop and deploy FIELD (FANUC Intelligent Edge Link and Drive), described as an industrial IoT platform for manufacturing. Just a few months later, FANUC partnered with NVIDIA to use its AI chips for “the factories of the future.”

FANUC is using deep reinforcement learning to help some of its industrial robots train themselves. They perform the same task over and over again, learning each time until they achieve sufficient accuracy. By partnering with NVIDIA, the goal is for multiple robots to learn together. The idea is that what could take one robot eight hours to learn, eight robots can learn in one hour. Fast learning means less downtime and the ability to handle more varied products at the same factory.

KUKA Working on Human-Robot Collaboration

KUKA, the Chinese-owned, Germany-based manufacturer of industrial robots, is investing in human-robot collaboration. The company has developed a robot that can work beside a human safely, owing to its intelligent controls and high-performance sensors. KUKA uses them; BMW is also a customer.

Robots that can work safely with humans will be able to be deployed in factories for new tasks, improving efficiency and flexibility.

Read the Plutoshift manufacturing study press release; read the source articles in emerj, Fast Company and Medium.

Source: AI Trends

AI Innovation in Manufacturing, Robotics, New Apps on Display at CES


AI was at the center of much innovation on display at the recent CES trade show, with highlights in the manufacturing industry and in robotics. (GETTY IMAGES)

By AI Trends Staff

AI was showcased in many areas at the recent CES trade show in Las Vegas, which drew over 4,400 exhibitors and 170,000 attendees.

In manufacturing, the majors were showing off their AI prowess.

A new washing machine from LG Electronics uses AI to precisely clean clothes. Internal sensors in the AI DD washer detect the load volume and weight as well as the clothing fabric, according to an account in WSJ Pro. The washing machine’s AI models then compare those details against 20,000 data points to determine the optimal cycle settings for the laundry. A load of T-shirts and pants needs a certain type of wash, temperature and wash time for best results; the AI figures it out and sets it.
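LG has not detailed the model, but the behaviour described, comparing a sensed load against a large table of reference data points to pick cycle settings, resembles a nearest-neighbour lookup. The sketch below shows that idea on a tiny made-up reference table; the feature encoding, fabric codes and cycle names are invented for illustration and are not LG’s.

import numpy as np

# Tiny stand-in for a reference table (the appliance reportedly uses ~20,000 points):
# columns are load weight (kg) and a coarse fabric code (0 = sturdy, 1 = delicate).
reference_loads = np.array([[2.0, 0.0], [4.5, 0.0], [6.0, 0.0], [1.5, 1.0], [3.0, 1.0]])
reference_settings = ["quick 30C", "normal 40C", "heavy 60C", "delicate 20C", "delicate 30C"]

def choose_cycle(weight_kg, fabric_code):
    """Pick the cycle whose reference load is closest to the sensed one (toy 1-NN)."""
    sensed = np.array([weight_kg, fabric_code])
    distances = np.linalg.norm(reference_loads - sensed, axis=1)
    return reference_settings[int(distances.argmin())]

print(choose_cycle(weight_kg=4.2, fabric_code=0.0))   # normal 40C
print(choose_cycle(weight_kg=2.8, fabric_code=1.0))   # delicate 30C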

John Deere & Co. showed off its See & Spray machine pulled behind a tractor, using vision and machine learning to detect weeds and determine plant health. See & Spray came with Deere’s acquisition of Blue River Technology in 2017 for $305 million. The technology is said to help reduce agrochemical use. The company is suggesting farmers can save up to $30 per acre by using See & Spray. “We’ve got AI in production, on machines, today and there is more coming,” stated John Stone, senior vice president for the Intelligent Solutions Group at Deere.

Mercedes-Benz is developing an AI system for its Sprinter vans to assist workers in loading packages in an optimal way. Expected to be available within a year, Coros—“cargo recognition and organization system”—is intended to help with “last-mile” logistics. Cameras in the van cargo section read package barcodes, then refer to a model that assesses the package size and where it best belongs. A blue light comes on in the shelf section where the package should be placed. The system identifies the package when it is being picked up for delivery as well. If a package does not belong in a vehicle, red lights come on. The hope is to optimize package loading and reduce training costs.
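Mercedes-Benz has not published how Coros makes these decisions, but the loading logic described can be mocked up simply: look up a scanned barcode in the van’s manifest, reject the package if it is not on the manifest, and otherwise light the shelf assigned to its delivery stop and size. The manifest, shelf-assignment rule and light signals below are illustrative assumptions.

# Hypothetical manifest for one van: barcode -> (delivery stop number, size class).
manifest = {
    "PKG-1001": (1, "small"),
    "PKG-1002": (1, "large"),
    "PKG-1003": (2, "small"),
}

def assign_shelf(stop, size):
    """Assumed layout rule: later stops load toward the rear, large parcels go low."""
    row = "rear" if stop >= 2 else "front"
    level = "floor" if size == "large" else "upper"
    return f"{row}-{level}"

def on_scan(barcode):
    """Return the light signal a Coros-style system might show for a scanned package."""
    if barcode not in manifest:
        return ("red", None)                    # package does not belong in this vehicle
    stop, size = manifest[barcode]
    return ("blue", assign_shelf(stop, size))   # light the target shelf

print(on_scan("PKG-1002"))   # ('blue', 'front-floor')
print(on_scan("PKG-9999"))   # ('red', None)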

Innovations Exploiting the Power of AI

Innovations were on display among the many exhibitors at CES. An account in Forbes highlighted a selection, including:

Whisk, a smart food platform acquired by Samsung’s innovation-focused subsidiary Samsung Next earlier this year, is now capable of scanning the contents of your refrigerator and suggesting dishes to cook. The AI model refers to research from over 100 nutritionists, food scientists, engineers, and retailers. The company suggests the technology will also help to reduce food waste.

Wiser from Schneider Electric is a small device that monitors energy use by each home appliance in real-time. It installs in the home circuit breaker box; its machine learning models are aimed at optimizing savings, including for solar systems.

A vital signs monitoring app from Binah.ai analyzes a person’s face to detect medical insights. The app detects oxygen saturation, respiration rate, heart rate variability and mental stress. Plans are to add monitoring for hemoglobin levels and blood pressure.

Robots Extending into Homes, Humanoid Forms

Interesting robots incorporating AI were the focus of an account from CES by TechRepublic, including the following examples.

Samsung’s Ballie, still a research project, is a tennis ball-sized life companion, reported ZDNet. Ballie is a small, round, rolling robot designed to support, understand, and react to the needs of its owner, specifically in households. Ballie’s AI capabilities use sensors and data within the home to attempt an immersive experience. The device will be designed to be able to connect with and control other smart devices in the home.

Graphics chipmaker NVIDIA showed its reach in robotics at CES, with its GPU chips used in a number of innovative robots. One example is Toyota’s new humanoid robot T-HR3, which runs on NVIDIA’s Jetson AGX Xavier computer. The T-HR3 uses advanced synchronization and master maneuvering to move smoothly and control the force of its body. The system is controlled by a human operator wearing a virtual reality (VR) headset, and the T-HR3 receives augmented video and perception data via the Jetson AGX Xavier computer within the robot, according to an NVIDIA blog post.

Other NVIDIA-powered robots at CES, cited in an account in ZDNet, included an autonomous wheelchair from WHILL powered by a Jetson TX2, a home security drone from Sunflower Labs, a delivery robot from PostMates and an inspection snake robot from Sarcos.

Read the source articles in WSJ Pro, Forbes and TechRepublic.

Source: AI Trends