data Archives - AI News
https://www.artificialintelligence-news.com/tag/data/

Stephen Almond, ICO: Prioritise privacy when adopting generative AI
https://www.artificialintelligence-news.com/2023/06/15/stephen-almond-ico-prioritise-privacy-adopting-generative-ai/
Thu, 15 Jun 2023

The post Stephen Almond, ICO: Prioritise privacy when adopting generative AI appeared first on AI News.

The Information Commissioner’s Office (ICO) is urging businesses to prioritise privacy considerations when adopting generative AI technology.

According to new research, generative AI has the potential to become a £1 trillion market within the next ten years, offering significant benefits to both businesses and society. However, the ICO emphasises the need for organisations to be aware of the associated privacy risks.

Stephen Almond, the Executive Director of Regulatory Risk at the ICO, highlighted the importance of recognising the opportunities presented by generative AI while also understanding the potential risks.

“Businesses are right to see the opportunity that generative AI offers, whether to create better services for customers or to cut the costs of their services. But they must not be blind to the privacy risks,” says Almond.

“Spend time at the outset to understand how AI is using personal information, mitigate any risks you become aware of, and then roll out your AI approach with confidence that it won’t upset customers or regulators.”

Generative AI works by generating content based on extensive data collection from publicly accessible sources, including personal information. Existing laws already safeguard individuals’ rights, including privacy, and these regulations extend to emerging technologies such as generative AI.

In April, the ICO outlined eight key questions that organisations using or developing generative AI that processes personal data should be asking themselves. The regulatory body is committed to taking action against organisations that fail to comply with data protection laws.

Almond reaffirms the ICO’s stance, stating that they will assess whether businesses have effectively addressed privacy risks before implementing generative AI, and will take action if there is a potential for harm resulting from the misuse of personal data. He emphasises that businesses must not overlook the risks to individuals’ rights and freedoms during the rollout of generative AI.

“We will be checking whether businesses have tackled privacy risks before introducing generative AI – and taking action where there is a risk of harm to people through poor use of their data. There can be no excuse for ignoring risks to people’s rights and freedoms before rollout,” explains Almond.

“Businesses need to show us how they’ve addressed the risks that occur in their context – even if the underlying technology is the same. An AI-backed chat function helping customers at a cinema raises different questions compared with one for a sexual health clinic, for instance.”

The ICO is committed to supporting UK businesses in their development and adoption of new technologies that prioritise privacy.

The recently updated Guidance on AI and Data Protection serves as a comprehensive resource for developers and users of generative AI, providing a roadmap for data protection compliance. Additionally, the ICO offers a risk toolkit to assist organisations in identifying and mitigating data protection risks associated with generative AI.

For innovators facing novel data protection challenges, the ICO provides advice through its Regulatory Sandbox and Innovation Advice service. To enhance their support, the ICO is piloting a Multi-Agency Advice Service in collaboration with the Digital Regulation Cooperation Forum, aiming to provide comprehensive guidance from multiple regulatory bodies to digital innovators.

While generative AI offers tremendous opportunities for businesses, the ICO emphasises the need to address privacy risks before widespread adoption. By understanding the implications, mitigating risks, and complying with data protection laws, organisations can ensure the responsible and ethical implementation of generative AI technologies.

(Image Credit: ICO)

Related: UK will host global AI summit to address potential risks

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The event is co-located with Digital Transformation Week.

Infocepts CEO Shashank Garg on the D&A market shifts and impact of AI on data analytics
https://www.artificialintelligence-news.com/2023/05/09/infocepts-ceo-shashank-garg-on-the-da-market-shifts-and-impact-of-ai-on-data-analytics/
Tue, 09 May 2023

The post Infocepts CEO Shashank Garg on the D&A market shifts and impact of AI on data analytics appeared first on AI News.

Could you tell us a little bit about your company, Infocepts?

On a mission to bridge the gap between the worlds of business and analytics, Infocepts was founded in 2004 by me and Rohit Bhayana, both with more than 20 years of experience in the Data and Analytics (D&A) industry. People often use the term business analytics as one phrase, but if you have worked in the industry for a long time and if you talk to a lot of people, you’ll realise just how big the gap is.

And that’s Infocepts’ focus. We are an end-to-end D&A solutions provider with an increasing focus on AI. Our solutions combine our processes, expertise, and proprietary technologies, all packaged together to deliver predictable outcomes to our clients. We work for marquee enterprise clients across industries. Infocepts has the highest overall ranking on Gartner peer reviews amongst our competitors and we are a Great Place to Work certified firm. So, we’re very proud that our clients and our people love us.

The data & analytics technology market is evolving very fast. What’s your view of it?

I love being in the data industry and a large reason is the pace at which it moves. In less than 10 years we have gone from about 60-70 technologies to 1,400+ and growing. But the problems have not grown 20X. That means, we now have multiple ways to solve the same problem.

Similarly, on the buyer side, we have seen a huge change in the buyer persona. Today, I don’t know of any business leader who is not a data leader. Today’s business leaders were born in the digital era and are super comfortable not just with insights but with the lowest-level data. They know the modelling methods and have an intuitive sense of where AI can help. Most executives in today’s world also have a deeper understanding of what data quality means, its importance, and how it will change the game in the long run.

So, we are seeing a big change both on the supply and demand side.

What are some of the key challenges you see in front of business & data leaders focused on data-driven transformation?

The gap between the worlds of business and analytics is a very, very real one. I would like to quote a leadership survey which highlights this contradiction. Talking about D&A initiatives which are adding value: 90% of data leaders believe their company’s data products provide real business value, but only 39% of business leaders feel so. That’s a huge gap. Ten years ago, the numbers would have been lower, but the gap was still the same. This is not a technology issue. What it tells us is that the most common roadblocks to the success of D&A initiatives are all human-related challenges, like skills shortages, lack of business engagement, difficulty accepting change, and poor data literacy throughout the organisation.

We all know the power of data and we spoke about business leaders being data leaders, but there are still people in organisations who need to change. Data leaders are still not speaking the language of business and are under intense pressure to demonstrate the value of D&A initiatives to business executives.

The pace at which technologies have changed and evolved is the pace at which you will now see businesses evolving through human-centric changes. The next five years look very transformational and positive for the industry.

Can you also talk about some of the opportunities you see in front of the D&A leaders?

The first big opportunity is to improve productivity to counter the economic uncertainty. Companies are facing a financial crunch because of ongoing economic uncertainty, including the very real possibility of a recession in the next 12-18 months. Data shows that there are companies that come out strong after a recession, with twice the industry averages in revenue & profits. These are the companies that are proactive in preparing & executing against multiple scenarios, backed by insights. They redeploy their resources towards the highest-value activities in their strategy and manage other costs. Companies need to stop paying for legacy technologies and fix their broken managed-services model. To keep up with the changing technology landscape, it’s important to choose on-demand talent.

Secondly, companies and people should innovate with data and become data fluent. Many organisations have invested in specialised teams for delivering data. But the real value from data comes only when your employees use it. Data fluency is an organisational capability that enables every employee to understand, access, and apply data as fluently as one speaks one’s own language. With more fluent people in an organisation, productivity increases, turnover reduces, and innovation thrives without relying only on specialised teams. Companies should assess their organisational fluency and consider establishing a data concierge. It’s like a ten-layered structure instead of a very big team: a concierge that can help you become more fluent and introduce interventions across the board to strengthen access, democratise data, and increase trust and adoption.

Lastly, there’s a huge opportunity to reimagine how we bring value to the business using data. Salesforce and Amazon pioneered commercially successful IaaS, PaaS, and SaaS models in cloud computing that gradually shifted significant portions of the responsibility for bringing value from the client to the service provider. The benefits of agility, elasticity, economies of scale, and reach are well known. Data & analytics technologies need to go through a similar paradigm shift and go one step further towards productised services: what we at Infocepts call Solutions as a Service!

Can you talk more about your Solutions as a Service approach?

What we mean by Solutions as a Service is a combination of products, problem solving, and expertise together in one easy-to-use solution. This approach is inevitable given the sheer pace at which technology is evolving. This new category requires a shift in thinking and will give you a source of advantage similar to what early cloud adopters gained during the last decade. Infocepts offers many domain-driven as-a-service solutions in this category, such as e360 for people analytics, AutoMate for business automations, and DiscoverYai (also known as AI-as-a-Service) for realising the benefits of AI.

There is a lot of buzz around AI. What does AI mean for the world of D&A and how real is the power of AI?

Oh! It’s very real. In the traditional BI paradigm, business users struggled to get access to their data, but even if they crossed that hurdle, they still needed to know what questions to ask. AI can be an accelerator and educator by helping business folks know what to look for in their data in the first place.

AI-driven insights can help uncover the “why” in your data. For example, augmented analytics can help you discover why sales are increasing and why market penetration varies from city to city, guiding you towards hidden insights for which you didn’t know where to look.

Another example is the use of chatbots or NLP driven generative AI solutions that can understand and translate queries such as, “What are sales for each category and region?” Thanks to modern computing and programming techniques combined with the power of AI, these solutions can run thousands of analyses on billions of rows in seconds, use auto ML capabilities to identify best fit models & produce insights to answer such business questions.

Then, through natural language generation, the system can present the AI-driven insights to the user in an intuitive fashion – including answers to questions that the user might not have thought to ask. With user feedback and machine learning, the AI can become more intelligent about which insights are most useful.

In addition to insights generation, AI can also play a role in data management & engineering by automating data discovery, data classification, metadata enhancements, data lifecycle governance, data anonymisation and more.

On the data infra side, trained machine learning models can be used to solve classification, prediction, and control problems to automate activities and add or augment capabilities such as predictive capacity & availability monitoring, intelligent alert escalation & routing, anomaly detection, ChatOps, root cause identification, and more.

Where can AI create immediate impact for businesses? Can you share some examples?

AI is an enabler for data and analytics rather than a technology vertical in itself. As an example, let’s look at the retail industry: use cases like store activity monitoring, order management, fraud/threat detection, and assortment management have existed for a while now. With AI, you can deliver them far faster.

In media, some of the use cases that we are helping our clients with are around demand prediction, content personalisation, content recommendation, synthetic content generation – both text & multimedia. AI also has vast applications in banking. We again have fraud detection, and coupled with automation, now it’s not just detection but you can also put controls in real time to stop fraud.

We have also implemented AI use-cases within Infocepts. We leverage AI to increase our productivity & employee engagement. Our HR team launched ‘Amber’, an AI bot that redefines employee experience. We use AI assistants to record, transcribe and track actions from voice conversations. Our marketing & comms teams use generative AI tools for content generation.

The advancement we have seen in the tech space in the last few years is what you will see in the next 3 to 4 years on the people side. And I think AI assisted tech processes and solutions will play a huge role there.

What advice would you give business leaders who are looking to get started with AI?                             

Embrace an AI-first mindset! Instead of the traditional approach of tackling complex business problems by sifting through data and wrestling with analysis for months before you see any results, an AI-first mindset lets you get things done in far less time. AI-driven auto-analysis uncovers hidden patterns and trends so analysts can get to the “why” faster and help their business users take action. The auto-analysis gives data teams access to hidden patterns and the dark corners of their data. Let your AI tools do most of the grunt work, faster than your traditional approaches. And now, with generative AI technologies bolted on top of these solutions, you can make it conversational using voice or natural language search capabilities.

Solutions like Infocepts DiscoverYai do just this, giving organisations the opportunity to make smart choices based on data-driven insights. Our process starts by identifying clients’ objectives and then leveraging advanced AI strategies that quickly assess data quality, highlight key relationships in your data, identify the drivers impacting your results, and surface useful recommendations. It also provides an impact analysis, resulting in actionable recommendations with maximum impact potential, all delivered through an effective combination of tried-and-tested practices and cutting-edge AI-driven processes!

Secondly, to gain the most from AI-driven insights, you’ll need to be ready for a little experimentation. Embrace getting it wrong and use those discoveries as learning opportunities. Hackathons/innovation contests are a great way to generate quick ideas, fail fast and succeed faster.

It’s also essential that your team can confidently understand data; this enables them to recognise useful actions generated by artificial intelligence without hesitation. So, while you use AI, ensure that it is explainable.

Lastly, help your organisation set up systems which will make sure your AI models don’t become obsolete in an ever-evolving landscape – keep upping their training so they remain ready to take on even harder challenges!

About Shashank Garg

Shashank Garg is the co-founder and CEO of Infocepts, a global leader in Data & AI solutions. As a trusted advisor to CXOs of several Fortune 500 companies, Shashank has helped leaders across the globe disrupt and transform their businesses using the power of Data Analytics and AI. Learn more about him on LinkedIn.

About Infocepts

Infocepts enables improved business results through more effective use of data, AI & user-friendly analytics. Infocepts partners with its clients to resolve the most common & complex challenges standing in their way of using data to strengthen business decisions. To learn more, visit: infocepts.com or follow Infocepts on LinkedIn.

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The event is co-located with Digital Transformation Week.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.

Devang Sachdev, Snorkel AI: On easing the laborious process of labelling data
https://www.artificialintelligence-news.com/2022/09/30/devang-sachdev-snorkel-ai-on-easing-the-laborious-process-of-labelling-data/
Fri, 30 Sep 2022

The post Devang Sachdev, Snorkel AI: On easing the laborious process of labelling data appeared first on AI News.

Correctly labelling training data for AI models is vital to avoid serious problems, as is using sufficiently large datasets. However, manually labelling massive amounts of data is time-consuming and laborious.

Using pre-labelled datasets can be problematic, as evidenced by MIT having to pull its 80 Million Tiny Images datasets. For those unaware, the popular dataset was found to contain thousands of racist and misogynistic labels that could have been used to train AI models.

AI News caught up with Devang Sachdev, VP of Marketing at Snorkel AI, to find out how the company is easing the laborious process of labelling data in a safe and effective way.

AI News: How is Snorkel helping to ease the laborious process of labelling data?

Devang Sachdev: Snorkel Flow changes the paradigm of training data labelling from the traditional manual process—which is slow, expensive, and unadaptable—to a programmatic process that we’ve proven accelerates training data creation 10x-100x.

Users are able to capture their knowledge and existing resources (both internal, e.g., ontologies and external, e.g., foundation models) as labelling functions, which are applied to training data at scale. 

Unlike a rules-based approach, these labelling functions can be imprecise, lack coverage, and conflict with each other. Snorkel Flow uses theoretically grounded weak supervision techniques to intelligently combine the labelling functions to auto-label your training data set en masse using an optimal Snorkel Flow label model.

Using this initial training data set, users train a larger machine learning model of their choice (with the click of a button from our ‘Model Zoo’) in order to:

  1. Generalise beyond the output of the label model.
  2. Generate model-guided error analysis to know exactly where the model is confused and how to iterate. This includes auto-generated suggestions, as well as analysis tools to explore and tag data to identify what labelling functions to edit or add. 

This rapid, iterative, and adaptable process becomes much more like software development rather than a tedious, manual process that cannot scale. And much like software development, it allows users to inspect and adapt the code that produced training data labels.
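To make the idea of labelling functions concrete, here is a minimal, self-contained sketch in Python. It is not the Snorkel Flow API: the function names, the spam/ham task, and the simple majority vote (standing in for Snorkel’s weak-supervision label model) are all invented for illustration.

```python
from collections import Counter

ABSTAIN, HAM, SPAM = -1, 0, 1

# Each "labelling function" encodes one noisy heuristic and may abstain.
def lf_contains_offer(text):
    return SPAM if "special offer" in text.lower() else ABSTAIN

def lf_contains_unsubscribe(text):
    return SPAM if "unsubscribe" in text.lower() else ABSTAIN

def lf_short_message(text):
    return HAM if len(text.split()) < 5 else ABSTAIN

LFS = [lf_contains_offer, lf_contains_unsubscribe, lf_short_message]

def auto_label(text):
    """Combine the noisy, possibly conflicting LF votes; abstain if none fire.

    A real label model weighs LFs by estimated accuracy and correlation
    rather than taking a plain majority vote as done here.
    """
    votes = [lf(text) for lf in LFS]
    votes = [v for v in votes if v != ABSTAIN]
    if not votes:
        return ABSTAIN
    return Counter(votes).most_common(1)[0][0]
```

Because the heuristics are code, each auto-applied label can be traced back to the functions that voted for it, which is the inspectability the paragraph above describes.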

AN: Are there dangers to implementing too much automation in the labelling process?

DS: The labelling process can inherently introduce dangers simply for the fact that as humans, we’re fallible. Human labellers can be fatigued, make mistakes, or have a conscious or unconscious bias which they encode into the model via their manual labels.

When mistakes or biases occur—and they will—the danger is the model or downstream application essentially amplifies the isolated label. These amplifications can lead to consequential impacts at scale. For example, inequities in lending, discrimination in hiring, missed diagnoses for patients, and more. Automation can help.

In addition to these dangers—which have major downstream consequences—there are also more practical risks of attempting to automate too much or taking the human out of the loop of training data development.

Training data is how humans encode their expertise to machine learning models. While there are some cases where specialised expertise isn’t required to label data, in most enterprise settings, there is. For this training data to be effective, it needs to capture the fullness of subject matter experts’ knowledge and the diverse resources they rely on to make a decision on any given datapoint.

However, as we have all experienced, having highly in-demand experts label data manually one-by-one simply isn’t scalable. It also leaves an enormous amount of value on the table by losing the knowledge behind each manual label. We must take a programmatic approach to data labelling and engage in data-centric, rather than model-centric, AI development workflows. 

Here’s what this entails: 

  • Elevating how domain experts label training data from tediously labelling one-by-one to encoding their expertise—the rationale behind what would be their labelling decisions—in a way that can be applied at scale. 
  • Using weak supervision to intelligently auto-label at scale—this is not auto-magic, of course; it’s an inherently transparent, theoretically grounded approach. Every training data label that’s applied in this step can be inspected to understand why it was labelled as it was. 
  • Bringing experts into the core AI development loop to assist with iteration and troubleshooting. Using streamlined workflows within the Snorkel Flow platform, data scientists and subject matter experts are able to collaborate to identify the root cause of error modes and how to correct them by making simple labelling function updates or additions, or, at times, correcting ground truth or “gold standard” labels that error analysis reveals to be wrong.

AN: How easy is it to identify and update labels based on real-world changes?

DS: A fundamental value of Snorkel Flow’s data-centric approach to AI development is adaptability. We all know that real-world changes are inevitable, whether that’s production data drift or business goals that evolve. Because Snorkel Flow uses programmatic labelling, it’s extremely efficient to respond to these changes.

In the traditional paradigm, if the business comes to you with a change in objectives (say, they were classifying documents three ways but now need a 10-way schema), you’d effectively need to relabel your training data set (often thousands or hundreds of thousands of data points) from scratch. This would mean weeks or months of work before you could deliver on the new objective.

In contrast, with Snorkel Flow, updating the schema is as simple as writing a few additional labelling functions to cover the new classes and applying weak supervision to combine all of your labelling functions and retrain your model. 

To identify data drift in production, you can rely on your monitoring system or use Snorkel Flow’s production APIs to bring live data back into the platform and see how your model performs against real-world data.

As you spot performance degradation, you’re able to follow the same workflow: using error analysis to understand patterns, apply auto-suggested actions, and iterate in collaboration with your subject matter experts to refine and add labelling functions. 

AN: MIT was forced to pull its ‘80 Million Tiny Images’ dataset after it was found to contain racist and misogynistic labels due to its use of an “automated data collection procedure” based on WordNet. How is Snorkel ensuring that it avoids this labelling problem that is leading to harmful biases in AI systems?

DS: Bias can start anywhere in the system – pre-processing, post-processing, task design, modelling choices, etc. – and, in particular, in issues with labelled training data.

To understand underlying bias, it is important to understand the rationale used by labellers. This is impractical when every datapoint is hand-labelled and the logic behind labelling it one way or another is not captured. Moreover, information about label authors and dataset versioning is rarely available. Often, labelling is outsourced, or in-house labellers have moved on to other projects or organisations.

Snorkel AI’s programmatic labelling approach helps discover, manage, and mitigate bias. Instead of discarding the rationale behind each manually labelled datapoint, Snorkel Flow, our data-centric AI platform, captures the labellers’ (subject matter experts, data scientists, and others) knowledge as labelling functions and generates probabilistic labels using theoretically grounded algorithms encoded in a novel label model.

With Snorkel Flow, users can understand exactly why a certain datapoint was labelled the way it is. This process, along with label function and label dataset versioning, allows users to audit, interpret, and even explain model behaviours. This shift from manual to programmatic labelling is key to managing bias.

AN: A group led by Snorkel researcher Stephen Bach recently had their paper on Zero-Shot Learning with Common Sense Knowledge Graphs (ZSL-KG) published. I’d direct readers to the paper for the full details, but can you give us a brief overview of what it is and how it improves over existing WordNet-based methods?

DS: ZSL-KG improves graph-based zero-shot learning in two ways: richer models and richer data. On the modelling side, ZSL-KG is based on a new type of graph neural network called a transformer graph convolutional network (TrGCN).

Many graph neural networks learn to represent nodes in a graph through linear combinations of neighbouring representations, which is limiting. TrGCN uses small transformers at each node to combine neighbourhood representations in more complex ways.

On the data side, ZSL-KG uses common sense knowledge graphs, which use natural language and graph structures to make explicit many types of relationships among concepts. They are much richer than the typical ImageNet subtype hierarchy.
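The difference between a fixed linear neighbourhood combination and transformer-style aggregation can be sketched in miniature. This toy example is not the published TrGCN architecture: it uses a single dot-product attention step over plain Python lists, with no learned parameters, purely to show how attention lets the combination weights depend on the data rather than being fixed.

```python
import math

def mean_aggregate(neighbours):
    """GCN-style: a fixed (here, uniform) linear combination of neighbour vectors."""
    dim = len(neighbours[0])
    return [sum(v[i] for v in neighbours) / len(neighbours) for i in range(dim)]

def attention_aggregate(node, neighbours):
    """Transformer-style: weights come from node/neighbour similarity (softmax
    over dot products), so similar neighbours contribute more."""
    scores = [sum(a * b for a, b in zip(node, v)) for v in neighbours]
    m = max(scores)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(node)
    return [sum(w * v[i] for w, v in zip(weights, neighbours)) for i in range(dim)]
```

With node [1, 0] and neighbours [1, 0] and [0, 1], the mean treats both neighbours equally, while the attention version weights the more similar neighbour higher.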

AN: Gartner designated Snorkel a ‘Cool Vendor’ in its 2022 AI Core Technologies report. What do you think makes you stand out from the competition?

DS: Data labelling is one of the biggest challenges for enterprise AI. Most organisations realise that current approaches are unscalable and often ridden with quality, explainability, and adaptability issues. Snorkel AI not only provides a solution for automating data labelling but also uniquely offers an AI development platform to adopt a data-centric approach and leverage knowledge resources including subject matter experts and existing systems.

In addition to the technology, Snorkel AI brings together 7+ years of R&D (which began at the Stanford AI Lab) and a highly-talented team of machine learning engineers, success managers, and researchers to successfully assist and advise customer development as well as bring new innovations to market.

Snorkel Flow unifies all the necessary components of a programmatic, data-centric AI development workflow—training data creation/management, model iteration, error analysis tooling, and data/application export or deployment—while also being completely interoperable at each stage via a Python SDK and a range of other connectors.

This unified platform also provides an intuitive interface and streamlined workflow for critical collaboration between SME annotators, data scientists, and other roles, to accelerate AI development. It allows data science and ML teams to iterate on both data and models within a single platform and use insights from one to guide the development of the other, leading to rapid development cycles.

The Snorkel AI team will be sharing their invaluable insights at this year’s AI & Big Data Expo North America. Find out more here and swing by Snorkel’s booth at stand #52.

Ash Damle, TMDC: Data-based business decisions in real-time
https://www.artificialintelligence-news.com/2022/09/22/ash-damle-tmdc-data-based-business-decisions-in-real-time/
Thu, 22 Sep 2022

The post Ash Damle, TMDC: Data-based business decisions in real-time appeared first on AI News.

Ash Damle, Head of AI and Data Science at TMDC, explains how the company is humanising and democratising data access.

AI News: The Modern Data Company (TMDC) aims to “democratise” data access. What are the benefits to enterprises? 

Ash Damle: Modern companies are data companies. When a data company’s best asset, its data, is only accessible by a handful of individuals, then the company is only scratching the surface of what data can do.

Democratisation of data enables every individual in the company to better perform, innovate, and meet business goals. Modern offers enterprises the ability to put data to work — that requires data to be available and to be trusted. 

AN: Can you still apply different levels of access to data based on individuals/teams? 

AD: You can absolutely still apply different levels of access to data within an organisation. In fact, our approach to governance is a key factor in enabling unprecedented levels of data access, transparency, and usability.

Our ABAC (attribute-based access control) approach provides granular governance controls so that admins can open data to flow to stakeholders without risking privacy or security loopholes. Users can search for and see what data is available for use, while stewards can see who is using data, when, and why.

Regardless of business size or industry, it is fully scalable and allows the organisation to apply compliance and governance rules to all data systematically. This is an entirely new way to approach governance. 
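The ABAC idea described above can be sketched in a few lines: access is granted by evaluating attributes of the user, the resource, and the action against policies, rather than by checking a fixed role list. This is a hypothetical illustration; the policy names, attribute keys, and deny-by-default behaviour are assumptions, not DataOS's actual implementation.

```python
# Attribute-based access control (ABAC) sketch: policies are predicates
# over user, resource, and action attributes; deny by default.

from dataclasses import dataclass

@dataclass
class Request:
    user_attrs: dict
    resource_attrs: dict
    action: str

def policy_analyst_read(req: Request) -> bool:
    # Analysts may read non-PII datasets owned by their own department.
    return (
        req.action == "read"
        and req.user_attrs.get("role") == "analyst"
        and not req.resource_attrs.get("contains_pii", True)
        and req.user_attrs.get("dept") == req.resource_attrs.get("owner_dept")
    )

POLICIES = [policy_analyst_read]

def is_allowed(req: Request) -> bool:
    # Deny by default; allow only if some policy grants access.
    return any(policy(req) for policy in POLICIES)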

AN: What features are in place to ensure compliance with data regulations? 

AD: Modern gives companies the flexibility to define and apply governance rules at the most granular levels. Our approach also enables admins and decision-makers to view their data ecosystem as a whole for critical governance questions such as: 

  • Who is using data and how are they using it? 
  • Where is data located and stored? 
  • Which business and risk processes does data impact? 
  • What dependencies exist downstream? 

AN: Another key goal of Modern is to “humanise” data. What does that mean in practice? 

AD: Being human involves intelligence and the ability to use that intelligence to inform and formulate dialogue. DataOS gives data an organised “voice,” enabling users to trust data to inform their decision-making. It acts as a data engineering partner, allowing users to have a real dialogue with data. 

AN: What are some of the other key problems with traditional data platforms that your solution, DataOS, aims to fix? 

AD: Most data solutions look at a database like it’s just a box of data. Most also operate within a data silo, which may help solve one problem but it can’t serve as an end solution.

The challenge for enterprises is they don’t exist on just one database. DataOS accounts for that, offering a unified source of truth and then empowering users to easily act on the data — no matter the source — with outcome-based data engineering. A user can choose the outcome they need and DataOS will build the right process for them while ensuring that the process is compliant with all security and governance policies.  

AN: How do you ensure your platform is accessible for all employees regardless of their technical skills or background? 

AD: DataOS allows data access and use for individuals according to granular rules set by the organisation. How the company manages access often depends on particular roles and responsibilities, as well as their in-house approach to security.  

AN: What data formats are supported by DataOS? 

AD: DataOS deals with heterogeneous formats, such as SQL databases, CSVs, Excel files, and many more. It also extracts data and allows enterprises to do more intelligent things with imagery, access essential data easily, and see metadata so they can leverage all data assets across the board. 

AN: Bad data is worse than no data. How do you check and report the quality of data? 

AD: With DataOS, organisations define their own rules for what to do with data before making it available. DataOS then automates enforcement of those rules to ensure they’re adhering to the right distributions and applying necessary quality checks. DataOS ensures that you’re always getting the best data possible and that you are always alerted of any data quality issues.
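The rule-definition-plus-automated-enforcement approach described above can be sketched as follows. This is an illustrative, stdlib-only example; the rule names, thresholds, and alert format are hypothetical, not DataOS's actual API.

```python
# Declarative data-quality checks run before data is published:
# each rule inspects the rows and returns (passed, message); failures
# are collected as alerts instead of silently letting bad data through.

def rule_no_nulls(rows, column):
    missing = sum(1 for r in rows if r.get(column) in (None, ""))
    return missing == 0, f"{missing} missing value(s) in '{column}'"

def rule_in_range(rows, column, lo, hi):
    bad = sum(1 for r in rows if not (lo <= r.get(column, lo) <= hi))
    return bad == 0, f"{bad} value(s) in '{column}' outside [{lo}, {hi}]"

def run_checks(rows, checks):
    # Run every check; collect an alert for each failure.
    alerts = []
    for check in checks:
        ok, message = check(rows)
        if not ok:
            alerts.append(message)
    return alerts

rows = [
    {"customer_id": "c1", "age": 34},
    {"customer_id": "", "age": 29},     # missing ID
    {"customer_id": "c3", "age": 210},  # implausible age
]
alerts = run_checks(rows, [
    lambda r: rule_no_nulls(r, "customer_id"),
    lambda r: rule_in_range(r, "age", 0, 120),
])
```

In a platform setting, the same pattern is wired into the publishing pipeline so the checks run automatically and failures trigger alerts rather than manual review.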

The Modern Data Company (TMDC) is sponsoring this year’s AI & Big Data Expo North America. Swing by their booth at stand #66 to learn more.

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London.

UK eases data mining laws to support flourishing AI industry – https://www.artificialintelligence-news.com/2022/06/29/uk-eases-data-mining-laws-support-flourishing-ai-industry/ – Wed, 29 Jun 2022

The post UK eases data mining laws to support flourishing AI industry appeared first on AI News.

The UK is set to ease data mining laws in a move designed to further boost its flourishing AI industry.

We all know that data is vital to AI development. Tech giants are in an advantageous position due to either having existing large datasets or the ability to fund/pay for the data required. Most startups rely on mining data to get started.

Europe has notoriously strict data laws. Advocates of regulations like GDPR believe they’re necessary to protect consumers, while critics argue they drive innovation, investment, and jobs out of Europe to countries like the USA and China.

“You’ve got your Silicon Valley startup that can access large amounts of money from investors, access specialist knowledge in the field, and will not be fighting with one arm tied behind its back like a competitor in Europe,” explained Peter Wright, Solicitor and MD of Digital Law UK.

An announcement this week sets out how the UK intends to support its National AI Strategy from an intellectual property standpoint.

The announcement comes via the country’s Intellectual Property Office (IPO) and follows a two-month cross-industry consultation period with individuals, large and small businesses, and a range of organisations.

Text and data mining

Text and data mining (TDM) allows researchers to copy and harness disparate datasets for their algorithms. As part of the announcement, the UK says it will now allow TDM “for any purpose,” which provides much greater flexibility than the exception made in 2014, which allowed researchers to use TDM for non-commercial purposes only.

In stark contrast, the EU’s Directive on Copyright in the Digital Single Market offers a TDM exception only for scientific research.

“These changes make the most of the greater flexibilities following Brexit. They will help make the UK more competitive as a location for firms doing data mining,” wrote the IPO in the announcement.

AIs still can’t be inventors

Elsewhere, the UK is more or less sticking to its previous stances—including that AI systems cannot be credited as inventors in patents.

The most high-profile case on the subject is of US-based Dr Stephen Thaler, the founder of Imagination Engines. Dr Thaler has been leading the fight to give credit to machines for their creations.

An AI device created by Dr Thaler, DABUS, was used to invent an emergency warning light, a food container that improves grip and heat transfer, and more.

In August 2021, a federal court in Australia ruled that AI systems can be credited as inventors under patent law after Ryan Abbott, a professor at the University of Surrey, filed applications in the country on behalf of Dr Thaler. Similar applications were also filed in the UK, US, and New Zealand.

The UK’s IPO rejected the applications at the time, claiming that – under the country’s Patents Act – only humans can be credited as inventors. Subsequent appeals were also rejected.

“A patent is a statutory right and it can only be granted to a person,” explained Lady Justice Laing. “Only a person can have rights. A machine cannot.”

In the IPO’s latest announcement, the body reiterates: “For AI-devised inventions, we plan no change to UK patent law now. Most respondents felt that AI is not yet advanced enough to invent without human intervention.”

However, the IPO highlights the UK is one of only a handful of countries that protects computer-generated works. Any person who makes “the arrangements necessary for the creation of the [computer-generated] work” will have the rights for 50 years from when it was made.

Supporting a flourishing AI industry

Despite being subject to strict data regulations, the UK has become Europe’s hub for AI with pioneers like DeepMind, Wayve, Graphcore, Oxbotica, and BenevolentAI. The country’s world-class universities churn out in-demand AI talent, and it attracts more than double the tech investment of other European countries.

(Credit: Atomico)

More generally, the UK is regularly considered one of the best places in the world to set up a business. All eyes are on how the country will use its post-Brexit freedoms to diverge from EU rules to further boost its industries.

“The UK already punches above its weight internationally and we are ranked third in the world behind the USA and China in the list of top countries for AI,” said Chris Philp, DCMS Minister.

“We’re laying the foundations for the next ten years’ growth with a strategy to help us seize the potential of artificial intelligence and play a leading role in shaping the way the world governs it.”

There will undoubtedly be debates over the decisions made by the UK to boost its AI industry, especially regarding TDM, but the policies announced so far will support entrepreneurship and the country’s attractiveness for relevant investments.

(Photo by Chris Robert on Unsplash)

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The event is also co-located with the Cyber Security & Cloud Expo.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.

How sports clubs achieve a slam dunk in loyalty with data – https://www.artificialintelligence-news.com/2022/06/06/how-sports-clubs-achieve-a-slam-dunk-in-loyalty-with-data/ – Mon, 06 Jun 2022

The post How sports clubs achieve a slam dunk in loyalty with data appeared first on AI News.

The way we watch, engage, and interact with our favourite sports clubs is undergoing a seismic shift. Recent UK research suggests that data will now have a more important role in fan engagement than ever before. In this article, we take a closer look at what this means for sports clubs serious about future-proofing their strategy to attract and retain loyal fans.

Matchday may be the ‘pinnacle’ for sports fans, but for sports clubs the real battleground is the period between live action: the time when a deep and enriching relationship with the fan base is actually created. As competition heats up to win the hearts and minds of fans through relevant marketing and to cut through the ‘noise’, the pressure is on sports clubs and associations to become more innovative. 

But despite the vast data sets at the fingertips of sports marketers, there is much room for improvement when it comes to delivering relevant, personalised communications, experiences, and content to fans in real-time.

Creating a relationship beyond matchday 

To create a relationship that goes beyond game day, sports brands must connect with fans on the right channels at the right time. With zero-party data, or data willingly shared by fans, it’s possible to know what makes fans tick as well as the best ways to engage with them.

How do sports clubs encourage fans to share more of their personal information? You know, the “good stuff” that goes beyond names and email addresses to who they’re attending matches with and if they also watch the game at home, for example. It’s all about the value exchange. And the value exchange begins with data. 

Revolutionising engagement with data 

Data allows sports clubs to move to a more enriched understanding of who their fans are. It gives them insight into their motivations and preferences. The biggest success of the sports clubs we work with is that, with Cheetah Experiences, fans willingly share their information.

To improve every fan’s experience along their digital journey, it’s vital that the communications they receive from the club are tailored. They have to be personalised to their particular wants and desires. That’s where the data comes in. While content is perhaps the “shiniest” element of the marketing mix, it’s the data and the insights that really make a difference. These elements provide clubs with all the information they need to create bespoke communications, helping to foster that one-to-one relationship with fans.  

Data is also key in creating effective partnerships with brands that want to sponsor sports clubs. Once clubs know more about the fans, their behaviours, and motivations at a country-level, the value of sponsorships can be greatly enriched. That’s because partners are looking for clubs with an engaged fan base, and the only way to get an engaged fan base is to know and create meaningful relationships with them. This, in turn, allows clubs to have successful commercial partnerships, which drives revenue into the club – revenue that allows them to invest back into the team and secure top-end spots in competition.

Turning challenges into opportunities

Not too long ago, the customer experience began and ended on matchday. Today, however, that’s simply not the case. In this new digital era, passionate fans are engaging with clubs on different platforms, 24/7. There’s no winter break, pre-season, or rest days for fan engagement – it’s constantly game-on.

Even when the pandemic upended the sporting landscape, grinding sports to a halt with no indication of when they would return, it wasn’t the time to stop engaging fans. Instead, it was more vital than ever to keep their passion alive. Developing new ways to build on a captive audience that was still hungry for sports was the first order of business for sports clubs and absolutely key to their survival.

But first, these clubs had to turn their unknown audience into a known audience. Digital channels and engagement are vital to helping clubs connect with their fans. It allows them to achieve deep, long-lasting, and meaningful relationships. Once fans feel connected to their clubs, their love grows and that creates a foundation that supports revenue creation and successful commercial partnerships.

However, this is nearly impossible to do without insights from data. Many clubs still have their data in silos where the ticketing team only sees their data, the hospitality team only sees their data, and so on. Getting away from silos and gaining a unified understanding of fans – who they are, what life stage they’re in, and what they want from the club – from top to bottom throughout the organisation is vital to revolutionising engagement.

Take a look at the Barcelona Spotify deal. If Barcelona truly knew its fans better, the deal could have been worth a lot more. However, since they didn’t, they were only able to target about 1% of their fan base — the rest were essentially invisible to them. 

The key takeaway from Barcelona’s unfortunate situation is just how crucial it is to get your fans to share information and permissions with you willingly. It’s absolutely essential in marketing to them more effectively.

And, of course, we can’t talk about effective marketing in today’s world without bringing up the death of the cookie. Never has there been a greater need to get fans to share their personal and preference data willingly than now. Unfortunately, it’s not an “ask and you shall receive” kind of arrangement. Fans are increasingly wary when it comes to handing over their personal information. That’s why sports clubs need to offer an enticing value exchange.

Leverage data for a game-winning loyalty strategy

When it comes to the value exchange, savvy sports clubs know that it doesn’t always have to be a discount or a red-letter prize that entices fans to share their details. Access to exclusive content and community initiatives can also be the catalyst for zero-party data collection.

According to Cheetah Digital’s report for sports teams and associations, 55% of fans will share psychographic data points like purchase motivations and product feedback with sports brands. Even more, half of all fans surveyed said they desire incentives like coupons, loyalty points, or exclusive access in return for their data. 

With Cheetah Digital’s Customer Engagement Suite, there’s an entire platform that makes it easy to build the most relevant, integrated, and profitable customer experiences. Take a look:

  • Cheetah Engagement Data Platform: This foundational data layer and personalisation engine enables marketers to drive data from intelligent insights to action at speed and scale.
  • Cheetah Experiences: Interactive digital acquisition experiences are delivered to delight customers, collect first- and zero-party data, and secure valuable permissions needed to execute compliant and successful marketing campaigns.
  • Cheetah Messaging: Enables marketers to create and deliver relevant, personalised marketing campaigns across all channels and touchpoints.
  • Cheetah Loyalty: Provides marketers with the tools to create and deliver unique loyalty programs that generate an emotional connection between brands and their customers.
  • Cheetah Personalisation: Enables marketers to leverage the power of machine learning and automated journeys to connect with customers on a one-to-one basis.

Acquisition helps to turn an “unknown” audience into a “known” audience. Why is this important? Well, with “known” fans come a lot of potential in the form of direct revenue, partner revenue, and participation.

The sports clubs to watch

Cheetah Digital has partnered with some of the world’s top sports brands and organisations to create and launch an array of successful campaign experiences with ease. Whether to boost match-day excitement, connect with fans, monetise a global audience, or increase content relevancy to reach a specific demographic; sports organisations are using Cheetah Experiences to create impactful digital experiences that drive results.

Below, we look at how Arsenal Football Club (F.C.) and the FA are leveraging a fully-fledged, zero-party data strategy to connect with fans on every digital channel and collect the preference insights and permissions required to drive personalisation initiatives. 

Arsenal F.C.

Arsenal F.C. intelligently uses data to enhance digital engagement amongst one of the largest and most passionate fan bases in the world – it’s estimated to be upwards of 750 million people! The club built out its omnichannel campaign strategies through various technologies, with Cheetah Digital as the main platform. That ensures the communications it sends out are relevant to fans and are delivered on the right platforms, at the right times, in the right tone. 

Adam Rutzler, Senior Campaign and Insight Manager at Arsenal, says the most crucial aspect of his team’s work is ensuring that fans receive the best content that’s most relevant to them. “We work with a magic triangle, the power of three – transactional data; a demographic segmentation, persona-led approach; and behavioural data,” he explains.

“We get a solid understanding of our fans by taking the combination of these three things and hitting the sweet spot in the middle. What are our fans buying, who are they, and how do they engage with our football club – that’s when we really get the power of understanding our fans, what they want from us, and how we can best give that to them.”

For example, Arsenal has found the score predictor game is well received with fans. It encourages them to guess the score of the upcoming match to win a prize. And that prize can be anything from signed shirts to training kits — whatever fans would desire. 

Where Arsenal has noticed the most traction and where it’s getting some real buy-in from fans, however, is in giving away those money-can’t-buy prizes, such as corner flags from matches. Fans are really excited about these types of prizes. That memorabilia from clubs is truly meaningful to fans who are very passionate about their teams.

Therefore, the experiences that we’re offering and serving up on behalf of the clubs that we work with need to be in tune with fans. They need to offer something fun and something that’s on-brand.

Going forward, Adam says he’s excited about all the possibilities data opens up for the club. “What’s exciting about the insights we’re working with right now to continue understanding our global following is the possibility of turning our triangle into a square by adding psychographic data in.

“We want to understand the fans’ attitudes, aspirations, and personalities. That will allow us to find out what motivates them to engage with certain communications of ours. If we understand that, it would provide us with some very powerful insights,” he says.

The Football Association (FA)

The FA has a grand ambition to double its contactable CRM database by 2024. Achieving this will drive direct revenue, boosting sales for the FA directly. It will increase partner revenue, expanding their reach and resonance with partners. And it will also drive participation in the sport at a grassroots level, which is basically the cornerstone of what the FA does.

In terms of value exchange, the FA is achieving above-average conversion rates, using a diverse set of tools like team selectors, man-of-the-match polls, and score predictors for upcoming FA Cup competitions. According to Paul Brierley, CRM & Membership Lead at the FA, the reason the FA’s strategy has been so effective boils down to its value proposition and relevance.

“Cheetah experiences, in particular, are helping us to drive an incredibly effective value exchange with fans. The combination of sought-after prizes, relevance and timing of that prize, and a compelling gamification experience is producing a highly successful channel for fan experience and data growth,” he says.

Future success

Going forward, there’s no way for a sports club to be successful without understanding its fan base. It’s paramount to capture their motivations, intentions, and preferences at scale to provide a truly personalised experience. By leveraging Cheetah Experiences and offering a value exchange, fans will tell all – the products they desire, what they look for in a loyalty program, and what motivates them to engage. And that information translates to a hugely successful club both now and into the future.

Download this campaign guide packed with examples from leading sports brands and associations that are delivering engaging, interactive experiences in return for fans’ opt-ins and preference data, and then using this data to deliver true personalisation.

(Editor’s note: This article is in association with Cheetah Digital)

US appeals court decides scraping public web data is fine – https://www.artificialintelligence-news.com/2022/04/19/us-appeals-court-scraping-public-web-data-fine/ – Tue, 19 Apr 2022

The post US appeals court decides scraping public web data is fine appeared first on AI News.

The US Ninth Circuit Court of Appeals has decided that scraping data from a public website doesn’t violate the Computer Fraud and Abuse Act (CFAA).

In 2017, employment analytics firm hiQ filed a lawsuit challenging LinkedIn’s efforts to block it from scraping data from users’ profiles.

The court barred LinkedIn from stopping hiQ scraping data after deciding the CFAA – which criminalises accessing a protected computer without authorisation – doesn’t apply due to the information being public.

LinkedIn appealed the case and in 2019 the Ninth Circuit Court sided with hiQ and upheld the original decision.

In March 2020, LinkedIn once again appealed the decision on the basis that implementing technical barriers and sending a cease-and-desist letter is revoking authorisation. Therefore, any subsequent attempts to scrape data are unauthorised and therefore break the CFAA.

“At issue was whether, once hiQ received LinkedIn’s cease-and-desist letter, any further scraping and use of LinkedIn’s data was ‘without authorization’ within the meaning of the CFAA,” reads the filing (PDF).

“The panel concluded that hiQ raised a serious question as to whether the CFAA ‘without authorization’ concept is inapplicable where, as here, prior authorization is not generally required but a particular person—or bot—is refused access.”

The filing highlights several of LinkedIn’s technical measures to protect against data-scraping:

  • Prohibiting search engine crawlers and bots – aside from certain allowed entities, like Google – from accessing LinkedIn’s servers via the website’s standard ‘robots.txt’ file.
  • ‘Quicksand’ system that detects non-human activity indicative of scraping.
  • ‘Sentinel’ system that slows (or blocks) activity from suspicious IP addresses.
  • ‘Org Block’ system that generates a list of known malicious IP addresses linked to large-scale scraping.

Overall, LinkedIn claims to block approximately 95 million automated attempts to scrape data every day.
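The robots.txt mechanism in the list above, allowing named crawlers while refusing everyone else, can be checked with Python's standard library. The bot names and rules below are illustrative, not LinkedIn's actual policy.

```python
# Parse a robots.txt policy that allows one named crawler and
# disallows all others, then test which user agents may fetch a URL.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The allow-listed crawler may fetch public pages...
google_ok = parser.can_fetch("Googlebot", "https://example.com/in/someone")
# ...while an unlisted scraper is refused by the catch-all rule.
scraper_ok = parser.can_fetch("ScraperBot", "https://example.com/in/someone")
```

Note that robots.txt is only a published policy, not an enforcement mechanism, which is why it sits alongside the active blocking systems the filing describes.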

The appeals court once again ruled in favour of hiQ, upholding the conclusion that “the balance of hardships tips sharply in hiQ’s favor” and the company’s existence would be threatened without having access to LinkedIn’s public data.

“hiQ’s entire business depends on being able to access public LinkedIn member profiles,” hiQ’s CEO argued. “There is no current viable alternative to LinkedIn’s member database to obtain data for hiQ’s Keeper and Skill Mapper services.” 

However, LinkedIn’s petition (PDF) counters that the ruling has wider implications.

“Under the Ninth Circuit’s rule, every company with a public portion of its website that is integral to the operation of its business – from online retailers like Ticketmaster and Amazon to social networking platforms like Twitter – will be exposed to invasive bots deployed by free-riders unless they place those websites entirely behind password barricades,” wrote the company’s attorneys.

“But if that happens, those websites will no longer be indexable by search engines, which will make information less available to discovery by the primary means by which people obtain information on the Internet.”

AI companies that often rely on mass data-scraping will undoubtedly be pleased with the court’s decision.

Clearview AI, for example, has regularly been targeted by authorities and privacy campaigners for scraping billions of images from public websites to power its facial recognition system.

“Common law has never recognised a right to privacy for your face,” Clearview AI lawyer Tor Ekeland once argued.

Clearview AI recently made headlines for offering its services to Ukraine to help the country identify both Ukrainian defenders and Russian assailants who’ve lost their lives in the brutal conflict.

Mass data scraping will remain a controversial subject. Supporters will back the appeal court’s ruling while opponents will join LinkedIn’s attorneys in their concerns about normalising the practice.

(Photo by ThisisEngineering RAEng on Unsplash)

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.

Ready to boost your business with analytics? This data expert reveals all – https://www.artificialintelligence-news.com/2022/04/08/ready-boost-your-business-analytics-data-expert-reveals-all/ – Fri, 08 Apr 2022

The post Ready to boost your business with analytics? This data expert reveals all appeared first on AI News.

By itself, data is like a bicycle with no wheels. It can’t go anywhere. That’s where the power of analytics comes in. Similar to the wheels of a bike, analytics powers data to reveal meaningful trends and insights, enabling organisations to make key business decisions. 

As data sets increase and become more complicated, manual analytics processes become less feasible. And as data gathering evolves, analytics has to keep up. That’s where advanced analytics joins the party. With the use of machine learning, data mining, and advanced modelling techniques, advanced analytics can turn an organisation’s vast amount of data into increasingly accurate predictions. 

Even more, advanced analytics can help identify trends and predict the best steps to yield positive outcomes. This helps organisations forge a path of strong and sustainable growth.

Adopting advanced analytics: a story of resistance and confusion

Despite the tremendous value-add advanced analytics can provide, it’s still often met with confusion and resistance in organisations. For example, in the UK, recent research commissioned by Exasol, the analytics database, found that 63% of UK data decision-makers experience resistance from employees in adopting data-driven methods. They attribute this resistance to anxiety over job redundancy, a lack of understanding, and a lack of education on the positive impact of data analytics. 

The report further reveals part of the acceptance problem: 40% of respondents admit that data strategy is not being driven by anyone in the organisation. With advanced analytics becoming mission-critical to all businesses, organisations must implement a clear data-driven strategy and ensure buy-in from all employees and stakeholders.

Fundamental misunderstandings sit at the heart of the user-adoption dilemma. Business Intelligence (BI) developers have built powerful tools, but they weren’t necessarily appropriate for the mindset and skillset of the end-user. They relied too much on the user being an analyst at heart.

The idea then became that if the tools were comparatively easy to use, all knowledge workers could turn into analysts. But that wasn’t necessarily the case either. Analytics has to be translated into action. And BI tools have to work for the end-user in order for the tools to become an integrated and natural part of staff workflow.

Infused, advanced analytics: a powerful step towards a data-first culture

Infused analytics (also called embedded analytics) is vital to making data abundantly available in a format that suits the needs of the user. It puts actionable intelligence from analysed data into every workflow, process, business application, and even internally developed products. 

This is the power of Sisense. With the Sisense platform, users enjoy fully customised experiences, driven by APIs that deliver the right information, at the right time and place, and in a way that makes sense to the user. This speeds up time to action, simplifies decision-making, and increases productivity. Because infused analytics sits within the existing technology stack a team already uses, they don’t have to learn a new tool or switch between platforms in order to gain insights. 

For organisations that desire to create a data-first culture, it’s imperative to take the user experience seriously. A global study from Exasol, the analytics database, found that 65% of data teams have experienced employee resistance to adopting data-driven methods. The two main reasons for the resistance were a lack of understanding of the organisation’s data strategy and a lack of education about the positive impact data brings. This confusion is having a detrimental effect, curbing data culture transformation as a whole.

Organisations that are starting their engagement with analytics need to make the data consumable and “bite-sized”. They must streamline the process so there is minimal gap between insight and action. In other words, bring the relevant data to the person, contextualised and within the workflow they already use.

To the business world, this is relatively new. But people today already glean insights from data in their daily lives, often without realising it, in ways that benefit them. Smartwatches, for example, leverage data to let people know when it’s time to stand up and walk around to meet personal step goals.

Popular apps and products have made the process of extracting value from data completely seamless, and in many cases, invisible. The data is so easy to consume because it’s right there and in the right context when it’s needed.

The takeaway? Users will adopt analytics when it works for them. That is, when it’s easy to access, non-intrusive, clear, direct, and provides the relevant insights that connect to their needs.

Advanced but accessible BI: engagement on their terms

Despite the numerous benefits of using data-driven insights for business decision-making, the simple fact is that not everyone is interested in becoming a data analyst. This is especially true in the post-pandemic business landscape, where many teams are already understaffed and feeling burnt out.

Knowledge workers would love to take action and make more decisions based on data, but not at the expense of their time. So continuous training to learn additional skills outside of their jobs just isn’t a solution.

To that end, BI tools must be user-friendly. And for a tool to be considered user-friendly, it has to shield the end-user from the complexities of the data. For example, business people think in business terminology. They think of opportunities, customers, pipelines, and revenue. So that’s how they need to be able to interact with data – on their terms and in a way that makes sense to them.

They don’t need to concern themselves with the complexities that underpin it all. Instead, they need the flexibility to ask questions in a natural way as opposed to a technical way. An advanced analytics tool like Sisense that provides the power to build custom-reporting dashboards offers the best of both worlds.

These dashboards can be tailored to the business needs and fit the questions that a variety of users may have. The visual dashboards also provide an at-a-glance summary of the data, which is perfect for keeping data easily digestible.

Making advanced analytics accessible brings an organisation one step closer to solving the user-adoption dilemma. And it helps create an engaged culture where end-users experience the value of custom dashboards for decision making. Dashboards can show users what they want to see, when they want to see it, and in a format that’s easy to understand.

Put simply, advanced analytics is one of the most powerful tools for business. There’s a treasure trove of information available to any company that’s willing to unlock it. That’s information that can provide a clear path to growth and profitability.

To dive deeper into advanced analytics and its benefit to business, download the How to Boost Your Business with Advanced Analytics ebook by Forecast here.

]]>
Reducing crime with better visualisation of data https://www.artificialintelligence-news.com/2022/03/16/reducing-crime-with-better-visualisation-of-data/ https://www.artificialintelligence-news.com/2022/03/16/reducing-crime-with-better-visualisation-of-data/#respond Wed, 16 Mar 2022 17:43:55 +0000 https://artificialintelligence-news.com/?p=11769 Effective policing relies on good data. The prevention and reduction of crime, particularly serious and organised crime, depends on law enforcement agencies being able to gain swift insights from the huge and increasing amount of information at their disposal. The problem, given the sheer volume and variety of that data, is where to look first.... Read more »

The post Reducing crime with better visualisation of data appeared first on AI News.

]]>
Effective policing relies on good data. The prevention and reduction of crime, particularly serious and organised crime, depends on law enforcement agencies being able to gain swift insights from the huge and increasing amount of information at their disposal.

The problem, given the sheer volume and variety of that data, is where to look first. So much of the data available to law enforcement data analysts and senior staff is unstructured. In other words, it doesn’t line up in an orderly fashion in a relational database or spreadsheet. Police forces collect data of many different types – images from CCTV, phone records, social media conversations and images, and so on. Tying that variety of sources together to achieve valuable insights is difficult.

It demands the very latest in data integration tools, able to aggregate all information of possible relevance and present it so that it delivers insights via a single, easy-to-use platform and allows correlations between datasets to be discovered. With today’s data visualisation techniques, a picture emerges from different data sets without time being wasted on wading through information. Organised criminals work fast and change tactics regularly. Time lost in elaborate and complex manual data searches can give them the chance they need to move on and evade detection.

Data visualisation is critical to today’s law enforcement efforts. It complements data analytics, converting information collected from various sources into a clear picture, displayed using familiar elements such as graphs, charts, and maps. By using natural language processing as well as artificial intelligence and machine learning capabilities, otherwise invisible patterns emerge.

An easily digestible view of data can help in several ways. Here are a few of them:

Interpreting visual data: The human brain can process visual data 60,000 times faster than it does text. Data visualisation gives law enforcement professionals a crucial edge because smart visual tools amplify human abilities and allow them to more easily spot anomalies or patterns in the data. They can also better understand operations, identify areas for improvement, and uncover missing evidence links for faster case resolution.

Deploying predictive analytics: Having access to predictive and prescriptive analytics means that law enforcement professionals can build and deploy statistical models that provide alerts when new incidents are likely to happen, with context on circumstances that require proactive investigation. Data visualisation is core to this because it provides an easy-to-understand translation of machine learning models and presents actionable intelligence. Patterns can be spotted, giving law enforcers a critical head start. Simple visual techniques such as assigning a range of amber to red colours to areas of concern on a map are highly effective.
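The amber-to-red colouring technique mentioned above amounts to a linear interpolation between two colours as an area’s risk score rises. A minimal sketch, with invented area names and scores:

```python
# Sketch of amber-to-red risk colouring for map areas.
# Risk scores run from 0.0 (low concern) to 1.0 (high concern);
# area names and scores are hypothetical.

AMBER = (255, 191, 0)
RED = (255, 0, 0)

def risk_colour(score):
    """Linearly interpolate between amber and red for a score in [0, 1]."""
    score = max(0.0, min(1.0, score))  # clamp out-of-range scores
    return tuple(round(a + (r - a) * score) for a, r in zip(AMBER, RED))

areas = {"North ward": 0.2, "City centre": 0.9, "Riverside": 0.55}
for name, score in areas.items():
    print(name, risk_colour(score))
```

A mapping layer would then fill each area’s polygon with its computed colour, so hotspots stand out at a glance.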

Sharing critical data: Data visualisation is not just of academic use to data scientists. It is useful for everyone in the law enforcement team, from officers on the street to supervisors and analysts in the office. Detectives investigating organised crime can use the visual output of these tools to see the connections between people, property and financial transactions within a crime syndicate without needing data science qualifications. Anyone can see what the data is saying. Different teams, indeed different police forces, can share information seamlessly without fear of system incompatibilities.

More than that, today’s tools can aggregate all the relevant information within and outside an agency and analyse it to deliver insights via a single platform. Crucially here, data can be handled in a secure manner so only those with the appropriate clearance can see it.

Managing tight resources: Law enforcers are always looking for more efficient resource allocation and better ways to juggle limited personnel and equipment. Badly organised resources can affect everything from crime clearance rates to departmental morale and perception in the community. With a data visualisation platform, they can spot areas that need immediate and long-term attention. They can also see which crimes have the biggest community impact and therefore need the most resources.

Improving community relations: Data visualisation gives police a chance to connect with their communities, demonstrating the results of their work in a digestible and interactive form. They can showcase incident-rate trends, initiate awareness about emerging security concerns and foster community engagement. Sharing data builds trust and cooperation, making it easier in the longer term to gather evidence and solve cases.

The right platforms are available today to allow law enforcers to make faster and more accurate decisions. The insights derived from visual analytics are already helping keep law enforcement personnel and civilians safe, reduce operational costs and improve investigation outcomes.

The police are not in a position to share all of the successes they have enjoyed with data visualisation, but others can. For example, how the Scottish Environment Protection Agency (SEPA) uses data to address the threat of illegal polluters offers a close and relevant comparison.

SEPA has a vital role in working with government, industry and the public to ensure regulatory compliance with environmental rules. It has a range of enforcement powers which it can apply to ensure that regulations are complied with. However, enforcement relies on the ability to intelligently analyse data from multiple sources, on air, water and soil quality for example.

SEPA holds millions of records, dating back decades and in a huge variety of formats, and used to rely on manually collecting, analysing and reporting its testing samples, setting the results alongside historic data to help spot pollution trends. With an analytics platform supplemented by data science and visualisation, SEPA has built a range of customisable solutions to address a wide variety of data-related tasks. Staff members carry visual analytics on a tablet wherever they go. No longer needing to write code or carry physical binders of data analyses, they can run data analytics on the spot and answer questions in the moment. Use cases have involved looking at pollutants, ecology and lab measurements, while others have covered industry compliance, laws and licences.

Just as it has done for SEPA, data visualisation can help law enforcers to identify never-before-seen patterns in data to make better decisions now and help steer future direction to resolve hidden challenges in their effort to reduce crime.

(Photo by Scott Rodgerson on Unsplash)

]]>
92% of product decision-makers say data and analytics is critical to success https://www.artificialintelligence-news.com/2022/02/07/product-decision-makers-data-analytics-critical-success/ https://www.artificialintelligence-news.com/2022/02/07/product-decision-makers-data-analytics-critical-success/#respond Mon, 07 Feb 2022 14:42:50 +0000 https://artificialintelligence-news.com/?p=11661 A new study, “The Business Intelligence Landscape,” commissioned by Sisense and conducted by The Harris Poll among product decision-makers, highlights that companies offering data and analytics to their customers have a competitive advantage and reap the benefits of increased revenue and loyalty.  However, the report shows there are some challenges to overcome. For example, 53%... Read more »

The post 92% of product decision-makers say data and analytics is critical to success appeared first on AI News.

]]>
A new study, “The Business Intelligence Landscape,” commissioned by Sisense and conducted by The Harris Poll among product decision-makers, highlights that companies offering data and analytics to their customers have a competitive advantage and reap the benefits of increased revenue and loyalty. 

However, the report shows there are some challenges to overcome. For example, 53% of respondents wish their analytics experience was more aligned with user-friendly entertainment applications, such as Netflix and Spotify.

Analytics drive business value

92% of product decision-makers say that data and analytics are critical to the success of their businesses. More than 4 in 5 (86%) say that offering data and analytics to their customers plays a critical role not only in the satisfaction of those customers but also in building and retaining loyal customers.

And for a direct tie to the bottom line, 96% note that an increase in average selling prices would be possible with personalised and customised analytics, with 46% noting they could charge 10-19% more for their products and services because of the analytics they provide.

Benefits of embedded, actionable personalised intelligence

Nearly all decision-makers (94%) feel that companies able to deliver data and analytics to the right people at the right time are considered innovative.

Other key points:

  • 96% believe their customers are interested in having AI-driven insights that can provide actionable, personalised intelligence in the context of their activity
  • 97% think their customers are interested in analytics provided in the context of the task the user is completing
  • 97% note that customers want analytics more personalised to the specific end user
  • 96% feel customers want data customised to their industries or consumer activity
  • 95% think their customers want interactive analytics
  • 56% believe that customers would find prescriptive analytics most useful 

Looking to the future, 81% of product decision-makers say that if they could provide their customers with personalised data and analytics, it should be provided by embedding those into communication software or platforms, custom-built apps or off-the-shelf business or SaaS applications.

Current barriers to success with analytics

While the numbers above speak to the opportunity, 83% of decision-makers think their customers at least sometimes make decisions without proper data and analytics.

However, product decision-makers cite barriers to delivering such offerings: 41% point to legal and compliance requirements, and 38% say their customers have difficulty accessing information. That difficulty may in large part be due to the fact that 92% of decision-makers deliver data and analytics to customers via non-embedded methods such as email and standalone dashboards, forcing customers to disrupt their workflows and go elsewhere for critical information.

Predicting what’s next for analytics in 2022

“The results from this third-party study are directly in line with what we are hearing from our customers and see in 2022 for analytics. Firstly, we expect organizations will redefine what it means to build a ‘culture of analytics’ by bringing insights to workers in a more digestible way, such as embedding them into regular processes so no new skills are required. Secondly, most data-driven organisations will combat tool fatigue by bringing data to workers where they are, directly within their workflows,” said Sisense Chief Product and Marketing Officer, Ashley Kramer. 

“And lastly, we see automation turning descriptive analytics, that demonstrate what already happened, and predictive analytics, profiling what will happen, into prescriptive guidance, focusing on what the best course of action to take is to make smart, proactive decisions,” Kramer continued.

Editor’s note: This article is in association with Sisense

(Photo by Hunters Race on Unsplash)

]]>