As the use of AI-enabled platforms continues to grow across industries and markets, adoption by retail companies will grow with it. There are four factors that will influence the adoption of AI in retail:

Think Big, Start Small

Retailers who adopted AI early are already benefiting from this innovation. Retailers that are new to using AI in their day-to-day operations will benefit from starting with the “basics.” It is important for retailers to remember that the goal is not to solve every problem at once, but to fix one problem at a time. People often get caught up in the task at hand or become distracted by too many problems. Remember the strategy of “test and learn”: make one adjustment toward personalization for the consumer and test it before moving on to the next.

AI Boosts Conversions, Revenue, and Customer Satisfaction

IDC Retail Insights predicts that by 2019, 40% of retailers will have developed a CX architecture supported by AI. IDC forecasts customer satisfaction scores will rise by 20%, employee productivity by 15%, and inventory turnover by 25%. All of this will be possible because AI, paired with AR and IoT data, will give retail companies the ability to hyper-personalize each customer’s experience.

Mobile Devices Will Help AI Flourish

The vast majority of the population has access to mobile devices and conducts most of their activities on them, which makes mobile a natural platform for AI adoption. The data collected from all these devices will allow companies to improve their customers’ experience. One company already successfully implementing an AI platform is Starbucks: among other things, its platform recommends specific orders to customers based on their prior purchase history. Mobile will play a big role in driving AI adoption in retail.
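As a toy illustration of the kind of purchase-history recommendation described above (this is a minimal sketch, not Starbucks’ actual system; the item names and logic are hypothetical), a recommender could simply surface a customer’s most frequently ordered items:

```python
from collections import Counter

# Hypothetical purchase history for one customer; not a real data model.
purchase_history = [
    "caffe latte", "cold brew", "caffe latte", "blueberry muffin",
    "caffe latte", "cold brew",
]

def recommend(history, top_n=2):
    """Recommend a customer's most frequently purchased items."""
    counts = Counter(history)
    return [item for item, _ in counts.most_common(top_n)]

print(recommend(purchase_history))  # ['caffe latte', 'cold brew']
```

Real systems layer far more signal (time of day, location, menu changes) on top of this idea, but the core is still learning from each customer’s own history.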

The Lack of Knowledge and Cultural Biases Will Hold Back the Adoption of AI

Two problems many companies face are a lack of knowledge and a lack of cultural readiness for innovation. These become obstacles when people within the company are afraid to adopt technology they don’t understand. Another hurdle retailers have to clear is the cost of integrating an AI platform into their existing systems.

Download the full report HERE

Building a data science team may seem like a daunting task, especially in this market, where talent with practical experience is scarce but interest and buzz in the field are extremely high. Here are a few tips for building and running a successful data science team.

Find the Right People

What roles must you fill for a complete data science team? You will need to have a variety of people with different types of skills:

  • Data scientists who can work with large datasets and understand the theory behind the science. Most importantly, they need to be capable of developing predictive models that fit your business context.
  • Data engineers and software developers who understand architecture, infrastructure, and distributed programming.
  • Other roles include a data solutions architect, data platform administrator, full-stack developer, and designer.

Build the Right Processes

The key thing to consider with data science workflows is agility. The team needs the ability to access and work with data in real time. The team then needs to be able to understand business problems and opportunities in the company and implement data solutions that solve those problems or facilitate growth. Make sure they are not handcuffed to slow and tedious processes, as this will limit effectiveness and make it harder to retain top talent.

Finally, the team will need to have a good working relationship with heads of other departments, and clear executive support, so they can work together in agile multi-disciplinary teams to deploy solutions that really benefit the business and will ultimately be adopted by business users.

Choose the Right Platforms

When building a data science competency, it is essential to consider the platform your company is using. A range of options is available from open source to paid services from major cloud providers and innovative startups.

We recommend you maintain some flexibility in your platforms because business and technology move fast, and you don’t want to tether your team to a tech stack that could become a limitation to their growth and flexibility. Hybrid architectures that utilize the right technologies for the right applications are ideal. Talented architects should be familiar with many different technologies and methods and understand how to select the right components for current and future use cases.

Take Your Time

Most importantly, you don’t want to rush and end up choosing the wrong people or platforms, or failing to put quality processes in place. Make sure to take your time to create a team that will work well together, has complementary skills, understands your business, and can deliver successful outcomes that get adopted by the business.

Ensure the Team’s Success

Once you have assembled the right team, here are five things to keep in mind to maximize the impact it can have as it starts building data-driven solutions that give you a competitive advantage:

Discoverability

Data science teams that are not practicing discoverability write scripts to solve individual problems without publishing them in a common place. For anyone else to access that work, they usually have to contact one of the data scientists directly and have it sent over in a presentation or an Excel sheet. This wastes time both for the person asking and for the data scientist, who has to devote time to re-delivering rather than innovating. A team that is successfully practicing discoverability publishes its work in a central location where everyone in the organization has access to it.

Automation

The difference between a data science team that does not focus on automation and one that does is simple: the team that does not focus on automation is continuously producing results by hand instead of letting its models do the work. The team that focuses on automation spends its time maintaining the pipeline instead of manually re-running the workflow. While automation takes more time up front, it pays off in multiple ways when done successfully. Automated pipelines make it much easier to build the insights and outcomes from your team’s efforts into business processes, continuously increasing the ROI on your data science endeavors.
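To make the contrast concrete, here is a minimal sketch of what an automated pipeline can look like, with hypothetical extract, score, and publish steps standing in for a real workflow. In practice, a scheduler such as cron or an orchestration tool would trigger it instead of a person re-running notebooks by hand.

```python
from datetime import datetime

def extract():
    # Pull the latest data from a source system (stubbed here for illustration).
    return [{"customer_id": 1, "spend": 120.0}, {"customer_id": 2, "spend": 45.5}]

def score(rows):
    # Apply a trained model; a placeholder rule stands in for the real model.
    return [{**r, "churn_risk": "high" if r["spend"] < 50 else "low"} for r in rows]

def publish(rows):
    # Write results somewhere discoverable (database, dashboard, shared store).
    print(f"{datetime.now():%Y-%m-%d %H:%M} published {len(rows)} scored rows")

def run_pipeline():
    publish(score(extract()))

if __name__ == "__main__":
    # A scheduler would own this call on a recurring basis; the team only
    # intervenes to maintain or improve the pipeline, not to re-run it by hand.
    run_pipeline()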

Collaboration

A data science team that focuses on collaboration and consistency will benefit significantly compared to those that do not. Collaboration allows the strengths of individuals to help the group as a whole, and it is much easier to achieve when there is consistency in how code is written from individual to individual. Teams that do not have a shared set of standards will have trouble collaborating and end up with inconsistent quality standards, versioning habits, and coding styles. Collaborating with business stakeholders and users is also an important component of successful data science deployments. Great models are useless if no one can use them, users don’t trust them, or they were developed without the correct business context.

Empowerment

Data science teams that agree to use the same stack of tools are better at discoverability and collaboration as well. The trick is to get the right tech stack for the needs of everyone on the team. A team that does not have a cohesive tech stack will suffer from an over-abundance of data storage and analysis tools and a lack of collaborative cohesion. Empowering your teams with tools that make their jobs easier and facilitate collaboration and automation will set them up for success and aid job satisfaction.

Deployment

There is a big difference between a workflow being “in production” and being “produced.” Work that is “in production” can still tolerate failure, while work that is “produced,” or finished, cannot. A good data science team will make sure to put tools into production that can be trusted and used to benefit stakeholders. It will not create things just because it can, instead focusing on the problems that actually need to be solved and making the results digestible and usable by the business.

Data Science as a Service

There are also many options for engaging external expert teams that can accelerate adoption of Data Science while also preparing your organization for growing in-house capabilities.

The same principles apply to service providers and consulting teams. Make sure they are equipped to build continuous value for your organization, not just deliver one-time results.

Sources:

https://mapr.com/blog/how-build-data-science-team/

http://lineardigressions.com/episodes/2017/9/24/disciplined-data-science

Many trends are coming to the foreground of AI, machine learning, and business intelligence. This article briefly discusses some of these trends and why they are emerging. A link to the in-depth report by Tableau can be found at the bottom of the page.

Do not Fear AI

Is AI the destructive force that will destroy all jobs and the world as we know it? The media and Hollywood have depicted it as such; however, this is not the case at all. At this point, machine learning and AI have become daily tools in business intelligence. These tools are giving time back to their human analyst counterparts, who use machine learning and AI software to understand their company’s data in a more timely fashion.

Liberal Arts Impact on AI

In the upcoming months, the liberal arts will play a bigger role in the building of AI and machine learning software. Data scientists are realizing their analyses not only need to be accurate but also need to tell a story that anyone can understand, including those without a technical background.

NLP (Natural Language Processing) Promise

NLP refers to the way we interact with AI through the UI (user interface). Companies are beginning to want employees at all levels to have access to the data provided by their AI software. The problem many of these companies face is that most of their employees do not have a technical background and have no idea how to write a query. This is where NLP comes into play: AI software can process queries posed in natural language instead of in code, e.g., “I want to know the sales for Item “001” by day at Store “2045”.”
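As a toy illustration of that example query (this is a sketch, not a real NLP product; the table and column names are hypothetical, and production systems rely on far more sophisticated language understanding), the idea is to translate a plain-English question into something like SQL:

```python
import re

# Hypothetical table daily_sales with columns sale_date, item_id, store_id, sales.
def to_sql(question: str) -> str:
    item = re.search(r'Item\s+"(\w+)"', question)
    store = re.search(r'Store\s+"(\w+)"', question)
    return (
        "SELECT sale_date, SUM(sales) AS sales\n"
        "FROM daily_sales\n"
        f"WHERE item_id = '{item.group(1)}' AND store_id = '{store.group(1)}'\n"
        "GROUP BY sale_date;"
    )

print(to_sql('I want to know the Sales for Item "001" by day at Store "2045"'))
```

A real NLP interface handles ambiguity, synonyms, and follow-up questions, but the end result is the same: business users ask questions in their own words and get data back.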

Multi-Cloud Capabilities

The move to multi-cloud storage is an ever-increasing desire within big companies. Companies don’t want to be limited to one storage method that may not provide the best performance for their data needs. Though multi-cloud architecture has many benefits, it also has its costs, one of which is the overhead of running this type of environment.

Rise of the CDO (Chief Data Officer)

With understanding data and analytics becoming a core competency, more and more companies are creating the position of CDO. The role joins the C-suite alongside the CEO, CTO, and CIO, giving the CDO the ability to attend C-level meetings and actually effect change within the company. By creating the CDO position, companies are showing just how important it is to understand their data and manage it successfully.

Crowdsourcing Governance

Crowdsourcing governance is a fancy term for allowing customers to shape who has access to specific data within a company using self-service analytics. It gets the right information into the right hands while keeping that same information out of the wrong hands.

Data Insurance

Data is more valuable than ever. We have seen countless data breaches over the last few years and will most likely see many more. With customer data becoming so valuable, we are going to see a rise in data insurance. This insurance will help cover the liability and costs companies face when their customer data is breached.

Data Engineering Roles

As data analysis software continues to grow in use and value, we will see a rise in data engineering roles over the next several years. Data engineers will begin to shift from architecture-centric roles to a more user-centric approach within their organizations.

Location of Things

“Location of things” is closely tied to the IoT (Internet of Things). Companies are trying to capture location-based data from IoT devices, and Gartner predicts there will be 2.4 billion IoT devices online by 2020. The problem is that companies are trying to collect and compile all of this location data within their internal data structures, and most of those structures are not capable of handling that quantity of data. This will lead to great innovations in IoT data storage.

Academics Investments

With data analytics growing in all industries, the demand for future data scientists will continue to grow. Due to this high demand for data engineers and data scientists, we will see more and more universities offering academic training in these fields over the next several years.


Read the full report by Tableau Here:

https://www.tableau.com/reports/business-intelligence-trends#ai

A huge thanks to everyone that came out to our first workshop event in Bentonville, AR! We had so much fun doing a Walmart Store 1 walk-thru (did you know there is a Dunkin Donuts inside!!), visiting customers, and holding our first Analytics Edge Workshop at the 21c Hotel.

The Workshop was incredibly well attended and we had some great dialogue with suppliers and others in the community about the potential of AI for business process automation and how easy access to Analytics can give anyone in an organization the opportunity to be strategic in their roles.

We also want to give a big thanks to John Daly of Sony Pictures Home Entertainment for joining us as a guest speaker; it is always inspiring to hear him talk about the transformation they have been able to achieve in such a short time!

We look forward to being back in Bentonville very soon!

In a 2016 research report, Why Artificial Intelligence is the Future of Growth, Accenture found that adoption of artificial intelligence across all industries may double economic growth rates by 2035. AI investment is expected to increase labor productivity by 40 percent. In fact, 70 percent of executives say they plan to “significantly increase” AI investment.

In the realm of inventory and supply-chain management, AI adoption, specifically the use of optimization algorithms, is revolutionizing inventory agility, reducing stockouts while keeping stock levels optimized.
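As one small, hedged example of the kind of calculation these algorithms build on, a classic reorder-point formula balances expected demand over the supplier lead time against safety stock; the figures below are illustrative, not drawn from the Accenture research.

```python
import math

def reorder_point(avg_daily_demand, demand_std, lead_time_days, z=1.65):
    """Reorder point = expected demand over the lead time + safety stock.

    z = 1.65 targets roughly a 95% service level under a normal-demand assumption.
    """
    safety_stock = z * demand_std * math.sqrt(lead_time_days)
    return avg_daily_demand * lead_time_days + safety_stock

# Illustrative inputs: 40 units/day average demand, std dev of 12, 5-day lead time.
print(round(reorder_point(avg_daily_demand=40, demand_std=12, lead_time_days=5)))  # ~244 units
```

AI-driven systems go further by forecasting the demand inputs themselves and re-optimizing continuously, but the goal is the same: reorder before a stockout without carrying excess inventory.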

“The use of AI in supply chains is helping businesses innovate rapidly by reducing the time to market and evolve by establishing an agile supply chain capable of foreseeing and dealing with uncertainties,” says Accenture Managing Director Manish Chandra. “AI armed with predictive analytics can analyze massive amounts of data generated by the supply chains and help organizations move to a more proactive form of supply chain management.”

Supply chain processes generate giga-tons of data, and AI can deploy predictive analytics to make sense of it all. Freshly updated and analyzed data then builds a solid foundation for real-time visibility and information flow. Every key player across the supply chain is empowered with the best data and can make the most of it.

AI is no longer an “ain’t-it-cool” innovation in the industry but rather a necessity. With the erosion of the brick-and-mortar model and rise of real-time consumer expectations, supply chain/inventory management practices must embrace machine learning that far outpaces the speed of human thought and action. Consider these stats from the 2017 MHI Industry report concerning the speed of supply-chain transactions from just one e-tailer on Black Friday:

“A reported 426 orders per second were generated from the website throughout the day. That equates to over 36 million order transactions, an estimated 250 million picking lines at the distribution centers (DC), 40 million DC package loading scans, 40 million inbound sortation hub scans, 40 million outbound sortation hub scans, 40 million inbound regional sortation facility scans and 40 million outbound delivery truck scans.”
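That headline figure is easy to sanity-check: 426 orders per second sustained over a 24-hour day works out to roughly 36.8 million orders, consistent with the report’s “over 36 million” claim.

```python
# Quick arithmetic check of the quoted Black Friday volume.
orders_per_second = 426
seconds_per_day = 24 * 60 * 60              # 86,400 seconds
print(orders_per_second * seconds_per_day)  # 36,806,400 -> "over 36 million" orders
```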

How should industry leaders respond? The answer, according to the report, is clear. Supply-chain companies must embed “analysis, data, and reasoning into the decision-making process. Position analytics as a core capability across the entire organization, from strategic planners through line workers, providing insight at the point of action.”

As Accenture economic research director Mark Purdy concludes, companies that survive will fully invest in the potential power of AI going forward: “To fulfill the promise of AI, relevant stakeholders must be thoroughly prepared – intellectually, technologically, politically, ethically and socially – to address the benefits and challenges that can arise as artificial intelligence becomes more integrated in our daily lives.”

Analogue: Scalability in Data Usage

At the intersection of big data and machine learning are patterns and analyses that reveal trends and causes. To use healthcare as an example, sensors built into wearable medical devices open windows to improved, individualized healthcare based on a rapidly expanding set of clinical, lab, physiological, and personal data. (A patient diagnosed with hypertension might wear a device that sends information to an application that detects ongoing changes in blood pressure, respiration, or other conditions in real time and alerts a physician when anomalies occur.)
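As a minimal sketch of the alerting pattern in that example (the thresholds and the alert_physician stub are illustrative assumptions, not a clinical protocol), streamed readings could be screened like this:

```python
# Hypothetical (systolic, diastolic) samples streamed from a wearable device.
READINGS = [(128, 82), (135, 88), (182, 121), (130, 85)]

def is_anomalous(systolic, diastolic, sys_limit=180, dia_limit=120):
    """Flag readings above illustrative crisis-range thresholds."""
    return systolic >= sys_limit or diastolic >= dia_limit

def alert_physician(reading):
    # Stand-in for notifying the care team through a real messaging channel.
    print(f"ALERT: anomalous reading {reading}, notify care team")

for reading in READINGS:
    if is_anomalous(*reading):
        alert_physician(reading)
```

Production systems would learn personalized baselines rather than fixed thresholds, which is exactly where the machine learning described here comes in.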

Predictive data technology moves past the goal of gaining insights and into the realm of insights on insights: namely, choosing the trends that require action. If the information received from the wearable monitor is utilized as cross-channel data, the challenge becomes making sense of the insight gained from the data and selecting the appropriate action. With this, the perspective may move from a simple focus on the instantaneous symptoms and treatment of hypertension to a holistic view of the patient’s respiratory, renal, and other systems’ response to standardized treatment.

The Human Factor

The relevance of obtaining cross-channel data from a hypertension patient is most apparent in the universal desire for individualized care. Scalable machine learning searches for efficient algorithms that can work with any amount of data and detect hidden insights. These insights yield logical, adaptive reasoning in performing specific actions, without consuming greater amounts of computing resources. Limits do exist, but predictive data technology adds another dimension to the interpretation of vast data sets. One that, in a business context, means greater efficiency and more thorough self-evaluation on a global scale.

In the marketplace, insights gained from cross-channel data emphasize the individual’s ability to change. While individuals may defy—with varying levels of deliberateness—predictability, machine learning and predictive data technology take an unrestrained, multi-dimensional view of preferences, real-time behavioral patterns, and possible intent. Thus the view of the “customer journey” is expanded: a mass of stops at a big-box store, from which a correlation would normally have been determined only in retrospect, now becomes a targeted real-time marketing effort—with the intuition to make progressively better use of progressively expanding data.

Moving to a New Meaning

Terms like “segmentation analysis” and “adaptive marketing” are themselves harbingers of a system that will soon replace the marketing philosophies of old. However, these new practices may themselves prove to be stepping stones to an even broader view of personalized marketing. Real freedom from scale is measured over time: through predictive data technology that offers personalized strategies for small businesses, large businesses, and corporations as they grow. This new outlook recognizes the consumer’s awareness of the marketplace and the complexity of their decisions, providing insights into profit margins based not only on the instantaneous relationship between product and cost, but also by an adaptive view of long-term customer behavior and loyalty.

Many trends are coming to the foreground of AI, machine learning, and business intelligence in 2017. This article briefly discusses these trends; a link to the in-depth report by Tableau can be found at the bottom of the page.

BI (Business Intelligence) the New Norm

In 2017, we will see more and more companies using modern business intelligence, allowing analytics to be performed by all employees, not just data scientists and engineers.

Collaboration between Machines and Humans Strengthens

The collaboration and sharing of data is going to move from one-directional exchanges, such as spreadsheets and emails, to an interactive flow of data between multiple parties and their live data streams.

Data Will Become Equal

All data will be equally accessible and understandable. We will be able to access all of our data without worrying about whether it is stored in the same format.

Anyone Will Be Able to Prep Data

Just as self-service analytics is becoming more accessible to non-technical employees, so too will the ability to understand and prep data without needing a technical background.

Embedded BI is Allowing Analytics to Grow Everywhere

Business applications like Salesforce are placing analytic tools in the hands of people never before exposed to data. These tools are extending the reach of analytics into our day-to-day lives, often without us even being aware that we are using them.

Work with Data in a Natural Way

In the next year, we will see people able to access and communicate with their data in a more natural way, driven by the integration of natural language interfaces into AI systems.

Cloud Based Analytics

With data being stored in the cloud, we will soon see analytics being conducted there as well. Cloud analytics will be faster and able to scale much more quickly.

Data Literacy Will Become a Necessity

With data analytics and predictive analysis moving into the mainstream, employees at all levels will need to be able to read and understand their company’s data.


Read the full report by Tableau Here:

https://www.tableau.com/learn/whitepapers/top-10-business-intelligence-trends-2017