platform Archives - AI News

Vodafone and Google launch AI Booster platform (6 July 2022)

A new platform launched by Vodafone and Google called AI Booster aims to handle thousands of ML models a day across 18+ countries.

AI Booster is the result of 18 months of development; it is built upon Google’s Vertex AI and integrates with Vodafone’s Neuron platform.

Vertex AI, among other Google technologies, had not been officially announced when Vodafone started development on AI Booster.

Cornelia Schaurecker, Global Group Director for Big Data & AI at Vodafone, said:

“To maximise business value at pace and scale, our vision was to enable fast creation and horizontal/vertical scaling of use cases in an automated, standardised manner. To do this, 18 months ago we set out to build a next-generation AI/ML platform based on new Google technology, some of which hadn’t even been announced yet.

We knew it wouldn’t be easy. People said, ‘Shoot for the stars and you might get off the ground…’ Today, we’re really proud that AI Booster is truly taking off, and went live in almost double the markets we had originally planned.

Together, we’ve used the best possible ML Ops tools and created Vodafone’s AI Booster Platform to make data scientists’ lives easier, maximise value, and take co-creation and scaling of use cases globally to another level.”

Google’s Vertex AI lets customers build, deploy, and scale ML models faster, with pre-trained and custom tooling within a unified platform.

Ashish Vijayvargia, Analytics Product Lead at Vodafone, commented:

“As a technology platform, we’re incredibly proud of building a cutting-edge MLOps platform based on best-in-class Google Cloud architecture with in-built automation, scalability, and security.

The result is we’re delivering more value from data science while embedding reliability engineering principles throughout.”

Vodafone highlighted four key features of AI Booster:

  1. Automated ML lifecycle compliance activities (drift/skew detection, explainability, auditability, etc.) via reusable pipelines, containers, and managed services (see the drift-check sketch after this list).
  2. Security embedded by design.
  3. Use of Google-native ML tooling, including BQML, AutoML, and Vertex AI.
  4. Boosted adoption through standardised, embedded ML templates.
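
The compliance pipelines themselves are not public, but the drift/skew checks referenced in the first feature can be illustrated with a minimal, generic sketch: compare the distribution of a feature seen at serving time against its training baseline and flag a significant shift. The Kolmogorov-Smirnov test and the 0.05 threshold below are illustrative assumptions, not details of Vodafone’s implementation.

```python
# Minimal, illustrative drift check: compare a serving-time feature sample
# against its training baseline with a two-sample Kolmogorov-Smirnov test.
# The 0.05 significance threshold is an assumption for illustration only.
import numpy as np
from scipy.stats import ks_2samp

def detect_drift(train_values: np.ndarray, serving_values: np.ndarray,
                 alpha: float = 0.05) -> bool:
    """Return True if the serving distribution differs significantly."""
    statistic, p_value = ks_2samp(train_values, serving_values)
    return p_value < alpha

# Example: a simulated baseline versus a shifted serving distribution.
rng = np.random.default_rng(42)
baseline = rng.normal(loc=0.0, scale=1.0, size=5_000)
serving = rng.normal(loc=0.4, scale=1.0, size=5_000)  # the mean has drifted
print("Drift detected:", detect_drift(baseline, serving))
```

In a managed setup, a check like this would run as a scheduled pipeline step, with alerts routed to the platform’s monitoring rather than printed.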

The new platform reportedly enables Vodafone’s data scientists and developers to cut the time from proof of concept to production from five months to just four weeks.

Vodafone and Google have forged a very close relationship in recent years. In May 2021, the two companies extended their relationship to build a global data platform.

“Vodafone’s flourishing relationship with Google Cloud is a vital aspect of our evolution toward becoming a world-leading tech communications company,” said Cengiz Ucbenli, Global Head of Big Data and AI, Innovation, Governance at Vodafone.

“It accelerates our ability to create faster, more scalable solutions to business challenges like improving customer loyalty and enhancing customer experience, whilst keeping Vodafone at the forefront of AI and data science.”

(Image Credit: Vodafone)

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.

Stefano Somenzi, Athics: On no-code AI and deploying conversational bots (12 November 2021)

No-code AI solutions are helping more businesses to get started on their AI journeys than ever. Athics, through its Crafter.ai platform for deploying conversational bots, knows a thing or two about the topic.

AI News caught up with Stefano Somenzi, CTO at Athics, to get his thoughts on no-code AI and the development of virtual agents.

AI News: Do you think “no-code” will help more businesses to begin their AI journeys?

Stefano Somenzi: The real advantage of “no code” is not just the reduced effort required for businesses to get things done; it also changes the role of the user who will build the AI solution. In our case, a conversational AI agent.

“No code” means that the AI solution is built not by a data scientist but by the process owner. The process owner is best-suited to know what the AI solution should deliver and how. But if coding is required, the process owner needs to translate his/her requirements into a data scientist’s language.

This requires much more time and is affected by the “lost in translation” syndrome that hinders many IT projects. That’s why “no code” will play a major role in helping companies approach AI.

AN: Research from PwC found that 71 percent of US consumers would rather interact with a human than a chatbot or some other automated process. How can businesses be confident that bots created through your Crafter.ai platform will improve the customer experience rather than worsen it?

SS: Even the most advanced conversational AI agents, like ours, are not suited to replace a direct consumer-to-human interaction if what the consumer is looking for is the empathy that today only a human is able to show during a conversation.

At the same time, inefficiencies, errors, and lack of speed are among the most frequent causes of consumer dissatisfaction that hamper customer service performance.

Advanced conversational AI agents are the right tool to reduce these inefficiencies and errors while delivering strong customer service performance at light speed.

AN: What kind of real-time feedback is provided to your clients about their customers’ behaviour?

SS: Recognising the importance of a hybrid environment, where human and machine interaction are wisely mixed to leverage the best of both worlds, our Crafter.ai platform has been designed from the ground up with a module that manages the handover of the conversations between the bot and the call centre agents.

During a conversation, a platform user – with the right authorisation levels – can access an insights dashboard to check the key performance indicators that have been identified for the bot.

This is also true during the handover when agents and their supervisors receive real-time information on the customer behaviour during the company site navigation. Such information includes – and is not limited to – visited pages, form field contents, and clicked CTAs, and can be complemented with data collected from the company CRM.
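
Crafter.ai’s handover module is proprietary, so the snippet below is only a hypothetical sketch of the kind of behavioural context described above (visited pages, form field contents, clicked CTAs, optional CRM enrichment) that could be passed to a human agent at handover. Every field name here is an illustrative assumption, not Athics’ actual schema.

```python
# Hypothetical handover payload; field names are illustrative only and do
# not reflect Crafter.ai's real data model.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class HandoverContext:
    conversation_id: str
    visited_pages: list[str] = field(default_factory=list)     # site navigation so far
    form_fields: dict[str, str] = field(default_factory=dict)  # partially completed forms
    clicked_ctas: list[str] = field(default_factory=list)      # calls-to-action clicked
    crm_record_id: Optional[str] = None                        # optional CRM enrichment

context = HandoverContext(
    conversation_id="conv-1234",
    visited_pages=["/pricing", "/contact"],
    form_fields={"email": "user@example.com"},
    clicked_ctas=["request-demo"],
)
print(context)
```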

AN: Europe is home to some of the strictest data regulations in the world. As a European organisation, do you think such regulations are too strict, not strict enough, or about right?

SS: We think that any company that wants to gain the trust of their customers should do their best to go beyond strict regulatory requirements.

AN: As conversational AIs progress to human-like levels, should it always be made clear that a person is speaking to an AI bot?

SS: Yes, a bot should always make clear that it is not human. In the end, this transparency can help people appreciate just how well these bots can perform.

AN: What’s next for Athics?

SS: We have a solid roadmap for Crafter.ai, with many new features and improvements that we bring to our platform every three months.

Our sole focus is on advanced conversational AI agents. We are currently working to add more and more domain-specific capabilities to our bots.

Advanced profiling capabilities are a great area of interest where, thanks to our collaboration with universities and international research centres, we expect to deliver truly innovative solutions to our customers.

AN: Athics is sponsoring and exhibiting at this year’s AI & Big Data Expo Europe. What can attendees expect from your presence at the event? 

SS: Conversational AI agents allow businesses to strike a balance between optimising resources and giving a top-class customer experience. Although there is no doubt regarding the benefits of adopting virtual agents, successful integration across a company’s conversational streams needs to be correctly assessed, planned, and executed in order to leverage its full potential.

Athics will be at stand number 280 to welcome attending companies and give an overview of the advantages of integrating a conversational agent, explain how to choose the right product, and how to create a conversational vision that can scale and address organisational goals.

(Photo by Jason Leung on Unsplash)

Athics will be sharing their invaluable insights during this year’s AI & Big Data Expo Global which runs from 23-24 November 2021. Athics’ booth number is 280. Find out more about the event here.

BT uses epidemiological modelling for new cyberattack-fighting AI (12 November 2021)

BT is deploying an AI trained on epidemiological modelling to fight the increasing risk of cyberattacks.

The first mathematical epidemic model was formulated and solved by Daniel Bernoulli in 1760 to evaluate the effectiveness of variolation of healthy people with the smallpox virus. More recently, such models have guided COVID-19 responses to keep the health and economic damage from the pandemic as minimal as possible.

Now security researchers from BT Labs in Suffolk want to harness centuries of epidemiological modelling advancements to protect networks.

BT’s new epidemiology-based cybersecurity prototype is called Inflame and uses deep reinforcement learning to help enterprises automatically detect and respond to cyberattacks before they compromise a network.

Howard Watson, Chief Technology Officer at BT, said:

“We know the risk of cyberattack is higher than ever and has intensified significantly during the pandemic. Enterprises now need to look to new cybersecurity solutions that can understand the risk and consequence of an attack, and quickly respond before it’s too late.

Epidemiological testing has played a vital role in curbing the spread of infection during the pandemic, and Inflame uses the same principles to understand how current and future digital viruses spread through networks.

Inflame will play a key role in how BT’s Eagle-i platform automatically predicts and identifies cyber-attacks before they impact, protecting customers’ operations and reputation.” 

The ‘R’ rate – used for indicating the estimated rate of further infection per case – has gone from the lexicons of epidemiologists to public knowledge over the course of the pandemic. Alongside binge-watching Tiger King, a lockdown pastime for many of us was to check the latest R rate in the hope that it had dropped below 1—meaning the spread of COVID-19 was decreasing rather than increasing exponentially.

For its Inflame prototype, BT’s team built models that were used to test numerous scenarios based on differing R rates of cyber-infection.

Inflame can automatically model and respond to a detected threat within an enterprise network thanks to its deep reinforcement training.

Responses are underpinned by “attack lifecycle” modelling – similar to understanding the spread of a biological virus – to determine the current stage of a cyberattack by assessing real-time security alerts against recognised patterns. The ability to predict the next stage of a cyberattack helps with determining the best steps to halt its progress.
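
BT has not published Inflame’s models, but the R-rate scenarios mentioned above can be illustrated with a toy, discrete-generation model in which a ‘case’ is a compromised host and R is the average number of further hosts each compromised host goes on to infect. All parameters below are illustrative assumptions, not anything from BT.

```python
# Toy discrete-generation spread model: each newly compromised host infects
# R further hosts on average, limited by how many susceptible hosts remain.
# Purely illustrative; this is not BT's Inflame model.
def simulate_spread(total_hosts: int, initially_infected: int,
                    r_rate: float, generations: int) -> list[int]:
    infected = initially_infected
    susceptible = total_hosts - initially_infected
    history = [infected]
    new_cases = initially_infected
    for _ in range(generations):
        # Spread slows as the pool of susceptible hosts is depleted.
        new_cases = min(susceptible,
                        round(new_cases * r_rate * susceptible / total_hosts))
        susceptible -= new_cases
        infected += new_cases
        history.append(infected)
    return history

for r in (0.8, 1.0, 2.0):
    print(f"R={r}: cumulative compromised hosts {simulate_spread(10_000, 10, r, 8)}")
```

Running the sketch shows the familiar pattern: with R below 1 an outbreak fizzles out, while with R above 1 the number of compromised hosts grows rapidly until the susceptible pool runs low.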

Last month, BT announced its Eagle-i platform which uses AI for real-time threat detection and intelligent response. Eagle-i “self-learns” from every intervention to constantly improve its threat knowledge and Inflame will be a key component in further improving the platform.

(Photo by Erik Mclean on Unsplash)

Looking to revamp your digital transformation strategy? Learn more about the Digital Transformation Week event taking place in Amsterdam on 23-24 November 2021 and discover key strategies for making your digital efforts a success.

Vincent Chio, Shopify: On using AI to revolutionise the retail industry (4 November 2021)

It’s always interesting to hear how AI is revolutionising specific industries, and few companies are more qualified to comment on the impact on the retail industry than Shopify.

AI News caught up with Vincent Chio, Data Science Lead at Shopify, to hear what the company is doing in the space and how AI is improving the end-to-end retail experience.

AI News: How has AI changed the shopping experience in recent years for both sellers and buyers?

Vincent Chio: From automated marketing, to smart fulfilment, and predictive analytics, AI has made it easier for retailers to manage and grow their business. But not every retailer has the resources to build and implement AI in their business.

That’s why at Shopify, we really focus on bringing the power of AI—which has historically been reserved for enterprise businesses—into the hands of our merchants, who are businesses of all sizes. To provide a few examples of what that looks like, we use machine learning in our fulfilment network to predict the closest fulfilment centres and optimal inventory quantities per location to ensure fast, low-cost delivery of our merchants’ products. Our business chat app, Shopify Inbox, is built on a natural language processing foundation that helps merchants convert conversations into sales. And, through state-of-the-art machine learning models that predict merchant business success, our product Shopify Capital automatically sends our merchants funding offers without them having to apply.

For buyers, the benefit shows up in the form of more accessible, seamless, and personalized shopping experiences. From targeted product recommendations, to faster shipping and better ways to connect, buyers expect more from retailers, and AI is helping retailers meet those expectations.

AN: What are some of the hurdles you’ve faced in bringing an AI model into production and how did you overcome them?

VC: When it comes to shipping anything, there are always hurdles you’re bound to face. At Shopify, we follow a few guiding principles for implementing and scaling AI that ensure easy adoption from the get-go. 

At Shopify, we take a merchant-first approach to identifying problem areas. So our first principle for implementing and scaling AI is to make sure that what we’re building is solving a merchant problem, and that we have enough data to create a solution. 

Second, we start simple. If a regression model will solve our merchant problem, that’s where we start. This doesn’t mean we avoid building complex models, it just means that we first prove that we can use AI/ML to solve the problem, and then we iterate by building complex models. 

These two steps are key for getting stakeholder buy-in; without it, your project can stall before it even gets off the ground. We’ve got more principles and tips that you can check out here.
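
The ‘start simple’ principle is straightforward to illustrate: fit a plain regression as a baseline, measure it, and only move to a more complex model if the baseline falls short. The sketch below uses scikit-learn on synthetic data and is purely illustrative; it is not Shopify’s code, and the feature and target values are made up.

```python
# "Start simple": a linear-regression baseline on synthetic data.
# If the baseline already solves the merchant problem, stop there;
# otherwise iterate towards more complex models.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(1_000, 3))  # three made-up merchant features
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.1, size=1_000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
baseline = LinearRegression().fit(X_train, y_train)
print("Baseline MAE:", mean_absolute_error(y_test, baseline.predict(X_test)))
# Only if this error is unacceptable is a more complex model justified.
```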

AN: An increasing number of third-party integrations are available for Shopify that harness AI—what are some of the most unique and/or interesting ones in your view?

VC: We’ve got a ton of third-party developers creating innovative apps that help extend the capabilities of our merchants’ stores. Some of the most interesting ones, in my opinion, are the ones using AI to translate those authentic, in-person retail experiences to online stores. Like I mentioned earlier, AI is changing the shopping experience by using data to bring more personalization to retail.

Our third-party apps that use algorithms to help merchants optimize the full buyer journey, from marketing and conversational automation, to recommendations and cart abandonment, not only have the power to help merchants convert, but they’re creating better shopping experiences for buyers.

AN: Shopify introduced LinNét earlier this year, its new product categorisation model. How does the new model differ from its predecessor and why was it deemed necessary?

VC: Shopify has seen amazing merchant growth in the past two years, hitting over 1.7 million merchants across the world. New merchants mean new products, so we decided to reevaluate our existing product categorization model. We wanted to evaluate our old model’s performance because it’s important that we understand what our merchants are selling, so we can build the best products that help power their business growth.

After evaluating key metrics like how often our predictions were correct and how often we provide a prediction, while also taking a look at our product road map and how our model might support new products, we decided to build a new model to improve our performance.

Compared to our old model, LinNét not only uses text features for prediction but also images. On top of this, LinNét has the ability to understand products in multiple languages. LinNét was also part of a larger effort to modernize our machine learning systems and we can now do things like real-time prediction, which we couldn’t do with the previous model.

With these new features, LinNét has increased our leaf precision by 8% while doubling our coverage. If you’re interested in learning more, read our blog!
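
LinNét itself is proprietary, but the pattern the answer describes can be sketched generically in PyTorch: encode the product text and the product image separately, concatenate the two embeddings, and classify into the category taxonomy. The embedding dimensions, hidden size, and category count below are placeholder assumptions, and real encoders (a language model and a vision model) would replace the random tensors.

```python
# Generic text+image category classifier; a sketch of the pattern only,
# not Shopify's LinNét model.
import torch
import torch.nn as nn

class MultimodalCategoryClassifier(nn.Module):
    def __init__(self, text_dim: int, image_dim: int, num_categories: int):
        super().__init__()
        # In practice the embeddings would come from pretrained text and
        # vision encoders; here they are simply assumed as inputs.
        self.fusion = nn.Sequential(
            nn.Linear(text_dim + image_dim, 512),
            nn.ReLU(),
            nn.Linear(512, num_categories),
        )

    def forward(self, text_emb: torch.Tensor, image_emb: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([text_emb, image_emb], dim=-1)
        return self.fusion(fused)  # logits over leaf categories

model = MultimodalCategoryClassifier(text_dim=768, image_dim=512, num_categories=1000)
logits = model(torch.randn(4, 768), torch.randn(4, 512))
print(logits.shape)  # torch.Size([4, 1000])
```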

AN: What’s next for AI at Shopify?

VC: We’re excited to further leverage the scale of our data to not only empower Shopify but to create new experiences for our merchants that are impossible without data.

Some of the ways we’re doing that are by exploring how to better support merchant workflows through product understanding, and by creating experiences for merchants that suggest best actions for their workflows, while foregrounding merchant autonomy.

That will show up through eliciting merchant feedback, such as accepting or rejecting our recommendations, and through education around our machine learning approaches to workflow optimization.

AN: Shopify is sponsoring, speaking, and exhibiting at this year’s AI & Big Data Expo Europe. What can attendees expect from your presence at the event?

VC: Attendees can expect the chance to really get to know the Shopify Data Science & Engineering team, and the kind of work we do. 

On day one of the expo, you’ll get to hear more from me! Diving into Shopify Inbox and the natural language processing foundation behind the product, I’ll illustrate how we accelerate product development with AI at Shopify, providing takeaways that can be used at any organization. I’ll also cover how to build a data foundation to establish trust and identify opportunities only AI can solve at scale.

Then, in the afternoon, you’ll have the chance to hear from Shopify’s Yizhar Toren, Senior Data Scientist. Yizhar will join a panel conversation on ramping up AI projects, discussing key tips like how to move your project from experimentation to production, and turning AI into ROI.

Attendees will also get the chance to meet our speakers and recruiters at our booth on the exhibition floor. Come say hi at booth #301!

(Photo by Mike Petrucci on Unsplash)

Shopify will be sharing its invaluable insights during this year’s AI & Big Data Expo Europe which runs from 23-24 November 2021. Shopify’s booth number is 301. Find out more about the event here.

QuEST partners with NVIDIA to deliver next-gen AI solutions in Japan (26 October 2021)

QuEST has extended its partnership with NVIDIA to accelerate the digital transformation of Japanese businesses with next-gen AI solutions.

NVIDIA named QuEST an Elite Service Delivery Partner in the NVIDIA Partner Network (NPN) back in June.

Through NPN, QuEST has early access to NVIDIA platforms, software, solutions, workshops, and technology updates. The previous agreement only covered the USA but NVIDIA has now extended the collaboration to Japan.

Masataka Osaki, Japan Country Manager and Vice President of Corporate Sales at NVIDIA, said:

“We are pleased to welcome QuEST as an NPN Elite Partner not only in the US, but also in Japan.

The NPN Elite-level status is reserved for partners who demonstrate a history of expertise in the areas of artificial intelligence, machine learning, and deep learning.

We hope that QuEST’s solutions and services, based on NVIDIA’s AI technology, will further boost the Japanese industry.”

QuEST has wasted no time in taking advantage of its NPN membership.

Using NVIDIA DGX systems, QuEST has trained custom vision AI models that are deployed for high-speed edge inference. Customers are able to begin enhancing their operations and decision-making through rapid proof-of-concept deployments. 
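
QuEST’s models and deployment stack are not described in detail, but the general ‘train centrally, infer at the edge’ pattern can be sketched: build a vision classifier with stock PyTorch and export it to ONNX, a common interchange format consumed by edge inference runtimes. The architecture, input shape, and file name below are assumptions for illustration.

```python
# Sketch of exporting a vision model for edge inference via ONNX.
# Illustrative only; not QuEST's actual pipeline.
import torch
import torchvision

model = torchvision.models.resnet18()      # stand-in for a custom-trained model
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)  # assumed input shape at the edge
torch.onnx.export(
    model,
    dummy_input,
    "vision_model.onnx",
    input_names=["image"],
    output_names=["logits"],
)
print("Exported vision_model.onnx for deployment to an edge runtime")
```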

Rajeev Nair, Vice President and Head of Japan Business, QuEST Global, commented:

“We are extremely proud that our NPN Elite partner status has been extended to Japan. QuEST is already engaged with key Japanese customers in high-tech, medical devices, power, and automotive domains providing engineering and digital services. 

The NPN partnership will help us further our efforts and provide the best to our customers in Japan.” 

NVIDIA and QuEST have established a deep relationship over the years. QuEST has been part of NVIDIA’s Jetson Partner Ecosystem since 2018 and was one of the first companies to be selected for the NVIDIA Deep Learning Consulting Partnership Program.

In 2019, QuEST debuted a groundbreaking solution to detect lung cancer nodules from CT scans. The solution uses the NVIDIA Jetson platform for deep neural network training and validation to develop models that enhance the accuracy of CT image analysis compared to conventional image processing methods.

“QuEST’s collaboration with NVIDIA in Japan will help accelerate AI-based digital transformation across our customers,” added Nair. “We look forward to working with NVIDIA to spur technology-driven business innovation and growth for customers across industries.”

(Photo by Jase Bloor on Unsplash)

Find out more about Digital Transformation Week North America, taking place on 9-10 November 2021, a virtual event and conference exploring advanced DTX strategies for a ‘digital everything’ world.

BenevolentAI’s drug discovery platform identifies novel target for ulcerative colitis (14 October 2021)

London-based AI pioneer BenevolentAI has identified a novel target for ulcerative colitis through its drug discovery platform.

The candidate was identified by scientists who used BenevolentAI’s target ID tools and machine learning models to identify and experimentally validate a novel biological target. Impressively, the achievement was made without any prior reference in published literature or patents linking the gene to ulcerative colitis.

Anne Phelan, Chief Scientific Officer at BenevolentAI, said:

“Ulcerative colitis is a chronic, lifelong disease that affects 0.2% of the US population alone and 1.6 million patients in the seven major markets, yet it is poorly served by the standard of care therapies.

Our novel preclinical candidate addresses the high unmet need for an oral, safe, and efficacious therapy and has demonstrated an improved safety and tolerability profile compared with other leading IBD treatments.

We are actively using patient-derived molecular descriptors to target patient subgroups that will optimise trial design and further increase our probability of success.”

Following the identification of the candidate, BenevolentAI’s molecular design capabilities were used to generate a potential oral and peripherally-restricted candidate drug. The preclinical candidate has been experimentally validated in ex-vivo ulcerative colitis colon samples from patients who didn’t respond to current treatments.

Joanna Shields, Chief Executive Officer of BenevolentAI, commented: “Nominating a drug candidate for a novel ulcerative colitis target identified by our AI-drug discovery platform represents a milestone for BenevolentAI but, more importantly, advances a new potential treatment for this debilitating disease.” 

BenevolentAI plans to advance the asset into clinical trials in early 2023.

(Photo by National Cancer Institute on Unsplash)

Find out more about Digital Transformation Week North America, taking place on 9-10 November 2021, a virtual event and conference exploring advanced DTX strategies for a ‘digital everything’ world.

Enterprise AI platform Dataiku announces fully-managed online service (14 June 2021)

Dataiku has announced a fully-managed online version of its enterprise AI platform to help smaller companies get started.

The data science platform enables raw data to be converted into actionable insights through data visualisation or the creation of dashboards and also supports training machine learning models.

“Accessibility has always been of the utmost importance at Dataiku. We developed Dataiku Online to address the needs of small and midsize businesses, in addition to startups,” said Florian Douetteau, CEO of Dataiku.

Historically, Dataiku has targeted large enterprises with the resources to deploy and manage its platform—companies which include Unilever, GE, Cisco, BNP Paribas, and over 400 others.

The new online version enables smaller companies to use Dataiku’s platform without needing dedicated administrators or their own infrastructure.

Douetteau added:

“We want to help companies that are just beginning their data and analytics journey to access the full power of our platform, where they can start by enhancing their day-to-day operations with simple data tools and then take their data even further with machine learning.

Companies don’t need big data to do big things with their data, and Dataiku Online will make it easier for a whole new class of companies — from lean startups to scaling SMBs — to start.”

Cloud data stack and storage tools such as those from Snowflake, Amazon Redshift, Google BigQuery, and more can be integrated with Dataiku’s online platform. In fact, a pre-integrated version of the platform can be found in the Snowflake Marketplace.

Scott Walker, Managing Partner at early Dataiku Online customer Sarissa Partners, commented:

“Dataiku Online allows us to focus on analysis, not server administration. The customer service is fast as well.

Data insights fuel our growth, and Dataiku Online enables us to develop insights faster than our competitors.”

To help smaller companies access the resources of their bigger competitors, Dataiku has launched an offering specifically for startups.

Seed-stage companies — those less than two years old, or with $5M or less in funding — and startups founded less than five years ago or with less than $10M in funding, are eligible for discounted pricing.

A 14-day free trial of Dataiku is available for companies of all sizes here.

Find out more about Digital Transformation Week North America, taking place on November 9-10 2021, a virtual event and conference exploring advanced DTX strategies for a ‘digital everything’ world.

Google launches fully managed cloud ML platform Vertex AI (19 May 2021)

Google Cloud has launched Vertex AI, a fully managed cloud platform that simplifies the deployment and maintenance of machine learning models.

Vertex was announced during this year’s virtual I/O developer conference and somewhat breaks from Google’s tradition of using its keynote to focus more on updates to its mobile and web development solutions. Google announcing the platform during the keynote shows how important the company believes it to be for a wide range of developers.

Google claims that using Vertex enables models to be trained with up to 80 percent fewer lines of code when compared to competing platforms.
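
As a rough sense of what that looks like in practice, the condensed sketch below uses the public Vertex AI Python SDK (google-cloud-aiplatform) to create a tabular dataset, train an AutoML model, and deploy it to an endpoint. The project, bucket, dataset, and column names are placeholders, and the arguments are trimmed to the essentials rather than forming a complete, production-ready example.

```python
# Condensed sketch of train-and-deploy with the Vertex AI Python SDK.
# All names and paths below are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1",
                staging_bucket="gs://my-staging-bucket")

# Create a managed tabular dataset from a CSV in Cloud Storage.
dataset = aiplatform.TabularDataset.create(
    display_name="churn-data",
    gcs_source="gs://my-bucket/churn.csv",
)

# Train an AutoML classification model on the dataset.
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="churn-automl",
    optimization_prediction_type="classification",
)
model = job.run(dataset=dataset, target_column="churned")

# Deploy to an endpoint and request an online prediction.
endpoint = model.deploy(machine_type="n1-standard-4")
print(endpoint.predict(instances=[{"tenure": "12", "plan": "basic"}]))
```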

Bradley Shimmin, Chief Analyst for AI Platforms, Analytics, and Data Management at Omdia, said:

“Data science practitioners hoping to put AI to work across the enterprise aren’t looking to wrangle tooling. Rather, they want tooling that can tame the ML lifecycle. Unfortunately, that is no small order.

It takes a supportive infrastructure capable of unifying the user experience, plying AI itself as a supportive guide, and putting data at the very heart of the process — all while encouraging the flexible adoption of diverse technologies.”

Vertex brings together Google Cloud’s AI solutions into a single environment where models can go from experimentation all the way to production.

Andrew Moore, VP and GM of Cloud AI and Industry Solutions at Google Cloud, said:

“We had two guiding lights while building Vertex AI: get data scientists and engineers out of the orchestration weeds, and create an industry-wide shift that would make everyone get serious about moving AI out of pilot purgatory and into full-scale production.

We are very proud of what we came up with in this platform, as it enables serious deployments for a new generation of AI that will empower data scientists and engineers to do fulfilling and creative work.”

Vertex provides access to Google’s MLOps toolkit which the company uses internally for workloads involving computer vision, conversation, and language.

Other MLOps features supported by Vertex include Vizier, which increases the rate of experimentation; Feature Store to help practitioners serve, share, and reuse ML features; and Experiments to accelerate the deployment of models into production with faster model selection.

Some high-profile companies were given early access to Vertex. Among them is ModiFace, a part of L’Oréal that focuses on the use of AR and AI to revolutionise the beauty industry.

Jeff Houghton, COO at ModiFace, said:

“We provide an immersive and personalized experience for people to purchase with confidence whether it’s a virtual try-on at web check out, or helping to understand what brand product is right for each individual.

With more and more of our users looking for information at home, on their phone, or at any other touchpoint, Vertex AI allowed us to create technology that is incredibly close to actually trying the product in real life.”

ModiFace uses Vertex to train AI models for all of its new services. For example, the company’s skin diagnostic service is trained on thousands of images from L’Oréal’s Research & Innovation arm and is combined with ModiFace’s AI algorithm to create tailor-made skincare routines.

Another firm that is benefiting from Vertex’s capabilities is Essence, a media agency that is part of London-based global advertising and communications giant WPP.

With Vertex AI, Essence’s developers and data analysts are able to regularly update models to keep pace with the rapidly-changing world of human behaviours and channel content.

Those are just two examples of companies whose operations are already being greatly enhanced through the use of Vertex. Now that the floodgates have opened, we’re sure there’ll be many more stories over the coming years and we can’t wait to hear about them.

You can learn how to get started with Vertex AI here.

(Photo by John Baker on Unsplash)

Interested in hearing industry leaders discuss subjects like this? Attend the co-located 5G Expo, IoT Tech Expo, Blockchain Expo, AI & Big Data Expo, and Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London, and Amsterdam.

Bosch partners with Fetch.ai to ‘transform’ digital ecosystems using DLTs (18 February 2021)

Bosch has partnered with Cambridge-based AI blockchain startup Fetch.ai with the aim of transforming existing digital ecosystems using distributed ledger technologies (DLTs).

The global engineering giant will test key features of Fetch.ai’s testnet until the end of this month and will deploy a node on the network. The strategic engineering project between Fetch.ai and Bosch is called the Economy of Things (EoT).

Dr Alexander Poddey, the leading researcher for digital socio-economy, cryptology, and artificial intelligence in the EoT project, said:

“Our collaboration with Fetch.ai spans from the aspects of governance and orchestration of DLT-based ecosystems, multi-agent technologies to collective learning.

They share our belief that these elements are crucial to realising the economic, social, and environmental benefits of IoT technologies.”

Fetch.ai’s testnet launched in October 2020 and the firm is now gearing up for its mainnet launch in March. The company has been ramping up announcements in advance of the mainnet launch and just last week announced a partnership with FESTO to launch a decentralised marketplace for manufacturing.

After the mainnet launch, Bosch intends to run nodes and applications on Fetch.ai’s blockchain network.

Jonathan Ward, CTO of Fetch.ai, commented:

“We have been working with Bosch for some time towards our shared vision of building open, fair, and transparent digital ecosystems. I’m delighted to be able to announce the first public step in bringing these technologies into the real world.

We’re looking forward to working further with Bosch to bring about the wide adoption of these ground-breaking innovations, which will hugely benefit consumers and businesses in many industries including automotive, manufacturing, and healthcare.” 

Fetch.ai is working on decentralised autonomous “agents” which perform real-world tasks. 

Bosch is attracted to Fetch.ai’s vision of collective learning technologies and believes it can be a key enabler of its plans for AI-enabled devices, allowing AI agents that operate within smart devices to be trained while preserving users’ privacy and control of their data.

Fetch.ai’s vision is bold but it has the team and partnerships to pull it off. The company’s roster features talent with experience from DeepMind, Siemens, Sony, and a number of esteemed academic institutions.

Bosch has long expressed a keen interest in distributed ledger technologies and established multiple industry partnerships.

The venture capital arm of Bosch, Robert Bosch Venture Capital, invested in the IOTA Foundation. Bosch later patented an IOTA-based digital payments system and recently financially supported a hackathon for the DLT platform, which uses a scalable DAG (Directed Acyclic Graph) data structure called the ‘Tangle’ in a bid to overcome some of the historic problems of early blockchains.

Fetch.ai and IOTA are in the same space but have different goals; it’s not a choice of one or the other. Companies like Bosch can take advantage of the exciting potential offered by both DLTs to gain a competitive edge.

(Photo by Adi Goldstein on Unsplash)

Interested in hearing industry leaders discuss subjects like this? Attend the co-located 5G Expo, IoT Tech Expo, Blockchain Expo, AI & Big Data Expo, and Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London, and Amsterdam.

AWS announces nine major updates for its ML platform SageMaker (9 December 2020)

Amazon Web Services (AWS) has announced nine major new updates for its cloud-based machine learning platform, SageMaker.

SageMaker aims to provide a machine learning service which can be used to build, train, and deploy ML models for virtually any use case.

During this year’s re:Invent conference, AWS made several announcements to further improve SageMaker’s capabilities.

Swami Sivasubramanian, VP of Amazon Machine Learning at AWS, said:

“Hundreds of thousands of everyday developers and data scientists have used our industry-leading machine learning service, Amazon SageMaker, to remove barriers to building, training, and deploying custom machine learning models. One of the best parts about having such a widely-adopted service like SageMaker is that we get lots of customer suggestions which fuel our next set of deliverables.

Today, we are announcing a set of tools for Amazon SageMaker that makes it much easier for developers to build end-to-end machine learning pipelines to prepare, build, train, explain, inspect, monitor, debug, and run custom machine learning models with greater visibility, explainability, and automation at scale.”

The first announcement is Data Wrangler, a feature which aims to automate the preparation of data for machine learning.

Data Wrangler enables customers to choose the data they want from their various data stores and import it with a single click. Over 300 built-in data transformers are included to help customers normalise, transform, and combine features without having to write any code.

Frank Farrall, Principal of AI Ecosystems and Platforms Leader at Deloitte, comments:

“SageMaker Data Wrangler enables us to hit the ground running to address our data preparation needs with a rich collection of transformation tools that accelerate the process of machine learning data preparation needed to take new products to market.

In turn, our clients benefit from the rate at which we scale deployments, enabling us to deliver measurable, sustainable results that meet the needs of our clients in a matter of days rather than months.”

The second announcement is Feature Store. Amazon SageMaker Feature Store provides a new repository that makes it easy to store, update, retrieve, and share machine learning features for training and inference.

Feature Store aims to overcome the problem of storing features which are mapped to multiple models. A purpose-built feature store helps developers to access and share features that make it much easier to name, organise, find, and share sets of features among teams of developers and data scientists. Because it resides in SageMaker Studio – close to where ML models are run – AWS claims it provides single-digit millisecond inference latency.
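
Amazon’s managed implementation is far more involved, but the core idea described above (a shared, named repository of features keyed by record ID, written once and read by both training jobs and online inference) can be sketched conceptually. The class below is a deliberately tiny, in-memory illustration of that idea, not the SageMaker API.

```python
# Conceptual, in-memory illustration of the feature-store idea; the real
# SageMaker Feature Store is a managed, durable, low-latency service.
from collections import defaultdict

class MiniFeatureStore:
    def __init__(self) -> None:
        # feature_group -> record_id -> {feature_name: value}
        self._groups: dict[str, dict[str, dict]] = defaultdict(dict)

    def ingest(self, group: str, record_id: str, features: dict) -> None:
        self._groups[group].setdefault(record_id, {}).update(features)

    def get_record(self, group: str, record_id: str) -> dict:
        return self._groups[group].get(record_id, {})

store = MiniFeatureStore()
# A feature-engineering job writes features once...
store.ingest("customers", "c-42", {"orders_90d": 7, "avg_basket": 31.5})
# ...and training pipelines and online inference read the same values.
print(store.get_record("customers", "c-42"))
```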

Mammad Zadeh, VP of Engineering, Data Platform at Intuit, says:

“We have worked closely with AWS in the lead up to the release of Amazon SageMaker Feature Store, and we are excited by the prospect of a fully managed feature store so that we no longer have to maintain multiple feature repositories across our organization.

Our data scientists will be able to use existing features from a central store and drive both standardisation and reuse of features across teams and models.”

Next up, we have SageMaker Pipelines, which AWS claims is the first purpose-built, easy-to-use continuous integration and continuous delivery (CI/CD) service for machine learning.

Developers can define each step of an end-to-end machine learning workflow including the data-load steps, transformations from Amazon SageMaker Data Wrangler, features stored in Amazon SageMaker Feature Store, training configuration and algorithm set up, debugging steps, and optimisation steps.

SageMaker Clarify may be one of the most important features debuted by AWS this week, given the growing scrutiny of bias in AI systems.

Clarify aims to provide bias detection across the machine learning workflow, enabling developers to build greater fairness and transparency into their ML models. Rather than turn to often time-consuming open-source tools, developers can use the integrated solution to quickly try and counter any bias in models.
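
Clarify’s exact metrics are not listed here, but the flavour of check it runs can be illustrated with one standard fairness measure: the ratio of favourable-outcome rates between a protected group and a reference group, often called disparate impact. The data below is simulated and the metric choice is illustrative; it is not necessarily Clarify’s formulation.

```python
# Generic disparate-impact check on simulated predictions; illustrative only.
import numpy as np

def disparate_impact(outcomes: np.ndarray, group: np.ndarray) -> float:
    """outcomes: 1 = favourable prediction; group: 1 = protected group."""
    rate_protected = outcomes[group == 1].mean()
    rate_reference = outcomes[group == 0].mean()
    return rate_protected / rate_reference

rng = np.random.default_rng(7)
group = rng.integers(0, 2, size=10_000)
# Simulated model that favours the reference group, for demonstration.
outcomes = rng.binomial(1, np.where(group == 1, 0.45, 0.60))
ratio = disparate_impact(outcomes, group)
print(f"Disparate impact ratio: {ratio:.2f} (values well below 1.0 suggest bias)")
```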

Andreas Heyden, Executive VP of Digital Innovations for the DFL Group, says:

“Amazon SageMaker Clarify seamlessly integrates with the rest of the Bundesliga Match Facts digital platform and is a key part of our long-term strategy of standardising our machine learning workflows on Amazon SageMaker.

By using AWS’s innovative technologies, such as machine learning, to deliver more in-depth insights and provide fans with a better understanding of the split-second decisions made on the pitch, Bundesliga Match Facts enables viewers to gain deeper insights into the key decisions in each match.”

Deep Profiling for Amazon SageMaker automatically monitors system resource utilisation and provides alerts where required for any detected training bottlenecks. The feature works across frameworks (PyTorch, Apache MXNet, and TensorFlow) and collects system and training metrics automatically without requiring any code changes in training scripts.

Next up, we have Distributed Training on SageMaker which AWS claims makes it possible to train large, complex deep learning models up to two times faster than current approaches.

Kristóf Szalay, CTO at Turbine, comments:

“We use machine learning to train our in silico human cell model, called Simulated Cell, based on a proprietary network architecture. By accurately predicting various interventions on the molecular level, Simulated Cell helps us to discover new cancer drugs and find combination partners for existing therapies.

Training of our simulation is something we continuously iterate on, but on a single machine each training takes days, hindering our ability to iterate on new ideas quickly.

We are very excited about Distributed Training on Amazon SageMaker, which we are expecting to decrease our training times by 90% and to help us focus on our main task: to write a best-of-the-breed codebase for the cell model training.

Amazon SageMaker ultimately allows us to become more effective in our primary mission: to identify and develop novel cancer drugs for patients.”

SageMaker’s Data Parallelism engine scales training jobs from a single GPU to hundreds or thousands by automatically splitting data across multiple GPUs, improving training time by up to 40 percent.
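
SageMaker’s data-parallelism library has its own API, which the announcement does not detail; the underlying pattern it implements (shard each batch across devices, run identical model replicas, and average gradients) is the same one stock PyTorch exposes as DistributedDataParallel. The sketch below shows that generic pattern and is not SageMaker-specific; the model, dataset, and launch command are placeholders.

```python
# Generic data-parallel training loop with stock PyTorch DDP, illustrating
# how batches are sharded across GPUs; not SageMaker's own library.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

def train(rank: int, world_size: int) -> None:
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    model = DDP(torch.nn.Linear(128, 10).to(rank), device_ids=[rank])
    optimiser = torch.optim.SGD(model.parameters(), lr=0.01)

    dataset = TensorDataset(torch.randn(4_096, 128), torch.randint(0, 10, (4_096,)))
    # DistributedSampler hands each process a distinct shard of the data.
    sampler = DistributedSampler(dataset, num_replicas=world_size, rank=rank)
    loader = DataLoader(dataset, batch_size=64, sampler=sampler)

    loss_fn = torch.nn.CrossEntropyLoss()
    for inputs, targets in loader:
        optimiser.zero_grad()
        loss = loss_fn(model(inputs.to(rank)), targets.to(rank))
        loss.backward()  # gradients are averaged across processes here
        optimiser.step()
    dist.destroy_process_group()

if __name__ == "__main__":
    # Typically launched with: torchrun --nproc_per_node=<gpus> this_script.py
    train(int(os.environ["LOCAL_RANK"]), int(os.environ["WORLD_SIZE"]))
```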

With edge computing advancements increasing rapidly, AWS is keeping pace with SageMaker Edge Manager.

Edge Manager helps developers to optimise, secure, monitor, and maintain ML models deployed on fleets of edge devices. In addition to helping optimise ML models and manage edge devices, Edge Manager also provides the ability to cryptographically sign models, upload prediction data from devices to SageMaker for monitoring and analysis, and view a dashboard which tracks and provides a visual report on the operation of the deployed models within the SageMaker console.

Igor Bergman, VP of Cloud and Software of PCs and Smart Devices at Lenovo, comments:

“SageMaker Edge Manager will help eliminate the manual effort required to optimise, monitor, and continuously improve the models after deployment. With it, we expect our models will run faster and consume less memory than with other comparable machine-learning platforms.

As we extend AI to new applications across the Lenovo services portfolio, we will continue to require a high-performance pipeline that is flexible and scalable both in the cloud and on millions of edge devices. That’s why we selected the Amazon SageMaker platform. With its rich edge-to-cloud and CI/CD workflow capabilities, we can effectively bring our machine learning models to any device workflow for much higher productivity.”

Finally, SageMaker JumpStart aims to make it easier for developers which have little experience with machine learning deployments to get started.

JumpStart provides developers with an easy-to-use, searchable interface to find best-in-class solutions, algorithms, and sample notebooks. Developers can select from several end-to-end machine learning templates (e.g. fraud detection, customer churn prediction, or forecasting) and deploy them directly into their SageMaker Studio environments.

AWS has been on a roll with SageMaker improvements—delivering more than 50 new capabilities over the past year. After this bumper feature drop, we probably shouldn’t expect any more until we’ve put 2020 behind us.

You can find coverage of AWS’ more cloud-focused announcements via our sister publication CloudTech here.

Interested in hearing industry leaders discuss subjects like this? Attend the co-located 5G Expo, IoT Tech Expo, Blockchain Expo, AI & Big Data Expo, and Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London, and Amsterdam.
