Each business will need a documented process for how it will scrub or remove the personally identifiable information (PII) connected to a consumer, across all of its systems, if there is no legal right or obligation to retain it. This can be a daunting task, depending on how many systems and cross-system data shares are in place. This is an area where Robotic Process Automation (RPA) may be the best answer. The first step in designing a Forget Robot is to document the details of all the places where data is stored.
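That first step could be captured as a machine-readable data inventory. The sketch below is purely illustrative (the system names, fields, and retention bases are hypothetical assumptions, not a real schema): it maps each system holding PII and records any legal basis for retention, so an erasure plan falls out directly.

```python
# Hypothetical "Forget Robot" data inventory (all names illustrative).
# Each entry records a system holding PII and any legal basis to retain it.
DATA_INVENTORY = [
    {"system": "crm",        "pii_fields": ["name", "email", "phone"], "retention_basis": None},
    {"system": "billing",    "pii_fields": ["name", "address"],        "retention_basis": "tax-law-7y"},
    {"system": "newsletter", "pii_fields": ["email"],                  "retention_basis": None},
]

def plan_erasure(inventory):
    """Split systems into those that must be scrubbed and those that may retain data."""
    scrub = [s["system"] for s in inventory if s["retention_basis"] is None]
    retain = [s["system"] for s in inventory if s["retention_basis"] is not None]
    return scrub, retain

scrub, retain = plan_erasure(DATA_INVENTORY)
print("scrub:", scrub)    # systems with no legal basis to keep PII
print("retain:", retain)  # systems with a documented retention obligation
```

The point of the sketch is that once the inventory exists as data rather than tribal knowledge, the RPA bot's scope becomes an explicit, auditable list rather than a guess.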
This is a continuation of the three-part series on machine learning for product managers. Part I focused on which problems are best suited to machine learning techniques. This note delves into the additional skill sets a PM needs when building products that leverage machine learning. As in Part I, the core skill sets required of a PM do not change whether or not you work in a machine-learning-driven solution space. Product managers typically rely on five core skills: customer empathy/design chops, communication, collaboration, business strategy and technical understanding.
The term artificial intelligence has generated so much hype that analysts, thought leaders, technology giants and basically anyone with a smartphone now holds a very clear opinion about this very broad term. That has ignited multi-layered discussions centered on a quite basic question: is AI good or bad? Good for humanity, or disruptive? Will it drive the growth of humanity, enabling us to do things faster and smarter, or is this the beginning of the end?
Big Data has been around for a while, and blockchain technology currently rides the hype wave. What results can the concoction of these two innovations produce? Sure enough, blockchain and Big Data are a match made in heaven. The real question nowadays is who will be the first to provide the most suitable and best-trained AI/machine learning model operating on top of distributed, transparent and immutable blockchain-generated data layers. The business that does this will roll in investments and generate immense profits.
The first principle of building a great product using machine learning is to focus on user needs. One common misconception is that machine learning somehow fundamentally changes the skill set of a PM. Machine learning is not an end unto itself; it is a tool for solving a real user need. Many people and companies with a cool AI technology think the technology alone validates its usage. If you have a cool technology to apply, think about what problems it could solve, or what experiences it could enhance.
Today our data scientists and data analysts are more like doctors who perform many functions themselves. We are fairly early in the evolution of roles to fulfill the end-to-end process of data analytics, and there is still tremendous opportunity to improve efficiency through better specialization of roles. Today we see the emergence of a new role: the data curator. By working with data engineers, data custodians, data analysts, and data scientists, the data curator develops a deep understanding of how data is used by the business, and how IT applies technology to make the data available.
Robotic Process Automation (RPA), Machine Learning (ML) and Artificial Intelligence (AI) get a lot of attention these days. After studying these technologies and finding ways to apply them appropriately in our business, it is clear that perceptions vary widely of what these tools do and how they can be applied by managers and by marketers. A lot of hype is being used to oversell benefits, and very little discussion takes place about the costs connected to these technologies.
The title explains it all. No intrigue. This is a comprehensive overview of big data use cases: we have summed up twenty industry-neutral examples that clearly show the practical value of big data. So, let’s talk business value.
Discussions about fact versus truth come up quite a bit these days, especially with the proliferation of fake news and the news media’s coverage of certain facts or non-facts. There are sites to remove political bias and interpretation from these “facts,” but why would we need such things? Is it because reporters actively make up information and deliberately lie to viewers? Or could it be that they simply tweak hard facts just enough to fit them into their preferred narrative?
The way that bots and AI will interact with our lives is still in the early stages of development, but what we do know is that AI can help us collaborate smarter. We therefore need to change the conversation from "humans need to understand machines" to "machines need to understand humans." AI can guide us toward better, more meaningful relationships with those around us in a way that’s quick, convenient and intuitive. We might not always admit it, but our need for a true connection is more powerful than our need for convenience.
Most people who attempt to get hired as a data scientist fail. This article aims to clarify why that happens and to increase your chances of landing your first data science job. Everyone looks the same, literally. How do you expect a hiring manager to put you in their top 5-10 list for the final round if you look just like everyone else in the applicant pool? Once you admit that your fancy resume is actually boring and you look just like everyone else, you can start making some meaningful changes.
Customers want goods made to order and delivered as soon as possible, in a way that suits their flexible lifestyles. That’s a byproduct of the mobile, app-driven, on-demand age. But for many organisations and their warehousing and logistics experts, those customer wants can shine a harsh spotlight on legacy business models. Today’s smart warehouses are increasingly rolling out transformation strategies that deploy sensors connected to the IoT – so that robots, workers, managers, and even smart vehicles, know the location of every item and can track them on their journeys.
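The tracking idea above can be sketched as a tiny location registry fed by sensor scans. This is a toy illustration only; the class, zone names, and SKU identifier are hypothetical assumptions, not any particular warehouse platform's API.

```python
# Toy sketch: IoT sensor scans update each item's last known location.
# All names and IDs are hypothetical.
from datetime import datetime, timezone

class WarehouseRegistry:
    def __init__(self):
        self.locations = {}  # item_id -> (zone, timestamp of last scan)

    def on_sensor_event(self, item_id, zone):
        """Record a sensor scan: item was seen in a zone."""
        self.locations[item_id] = (zone, datetime.now(timezone.utc))

    def locate(self, item_id):
        """Return (zone, timestamp) of the item's last scan, or None."""
        return self.locations.get(item_id)

registry = WarehouseRegistry()
registry.on_sensor_event("SKU-1042", "inbound-dock")
registry.on_sensor_event("SKU-1042", "aisle-7")
zone, seen_at = registry.locate("SKU-1042")
print(zone)  # the most recent scan wins: aisle-7
```

In a real deployment the registry would be backed by a durable store and fed by an event stream, but the core idea is the same: every scan overwrites the item's last known location, so workers, managers, and vehicles all query one source of truth.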
In this age of big data and powerful commodity hardware, there’s an ongoing debate about node size. Does it make sense to use a lot of small nodes to handle big data workloads? Or should we instead use only a handful of very big nodes? If we need to process 200TB of data, for example, is it better to do so with 200 nodes with 4 cores and 1 terabyte each, or to use 20 nodes with 40 cores and 10 terabytes each? One reason we hear is that having all this processing power doesn’t really matter because it’s all about the data. If nodes are limited to a single terabyte, increasing processing power doesn’t really help things and only serves to make bottlenecks worse.
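A quick back-of-the-envelope comparison of the two cluster shapes from the example makes the debate concrete (the figures come straight from the text; the calculation itself is illustrative, not a benchmark):

```python
# Compare the two cluster shapes: many small nodes vs. a few big nodes.
def cluster_stats(nodes, cores_per_node, tb_per_node):
    total_cores = nodes * cores_per_node
    total_tb = nodes * tb_per_node
    return {
        "total_cores": total_cores,
        "total_tb": total_tb,
        "tb_per_core": total_tb / total_cores,
    }

small_nodes = cluster_stats(nodes=200, cores_per_node=4, tb_per_node=1)
big_nodes = cluster_stats(nodes=20, cores_per_node=40, tb_per_node=10)

print(small_nodes)  # 800 cores, 200 TB, 0.25 TB per core
print(big_nodes)    # 800 cores, 200 TB, 0.25 TB per core
```

Both shapes have identical aggregate cores, capacity, and data per core, which is exactly why the debate is really about per-node factors: disk and network bandwidth per node, failure blast radius, and coordination overhead rather than raw totals.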
There is more than one way in which IoT can enable industries to accelerate growth, transform economies and achieve a new level of competitiveness. With IIoT, manufacturers can ensure greater efficiencies across the value chain, ranging from operations and services to engineering and product supply. The inherent benefits of IIoT, including asset optimization, smart monitoring, predictive maintenance and, most importantly, intelligent decision-making, are rapidly making it an irreplaceable technology.
Data science is currently very good at coming up with answers. It’s not very good at coming up with questions. I believe that requires data scientists to pay more attention to building non-technical skills, but I think it also requires us to build more tools that facilitate that part of the process. In fact, building the tools will contribute, in large measure, to building the non-technical skills.
Data professionals have to consider the environment around them when creating a data story. It’s not enough to find an issue and then start raising a red flag. Consider your audience and craft your message so that they can hear bad news, consider whether others even see the issue as a problem, and then work with them to solve it.
Everyone has AI on their roadmap these days. Bottom-tier innovation verticals like HR, multi-level marketing, entertainment, fashion, medical, and supply chain are even starting to talk about it. Despite the hype and excitement, the majority of companies that commit to tackling AI projects will fail. Even that $1M+ hire won't save you from failure. Here are some of the main reasons why your AI project failed or will fail. These seem to have resonated well with others, so I figured I would share them.
In this world of exponential data growth, companies are turning to two roles to solve some of their biggest problems: Data Analyst and Data Scientist. However, it’s becoming more apparent that the business world is unsure how to appropriately define the scope of these roles and differentiate between them. The two require nearly identical skills, but a key difference separates them. Businesses need to ensure they do not blur the lines.
Taking a passive approach to Business Intelligence (BI) is a mistake many companies make today. Their competitors mine data to optimize their position in the marketplace, from customers and products all the way to market share and growth patterns. So why are so many companies still so fearful of BI? Here are the top five myths, debunked.
Many of the RPA vendors have a well-rehearsed pitch and presentation clearly aimed at global companies with tens of thousands of employees. RPA still has impactful potential for SMB companies, but the analysis and application need to be far more granular and explicit about avenues of cost savings beyond head-count reductions. In an SMB context, many of the processes that should be automated will see FTE effort savings of less than 100% of one FTE.
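The partial-FTE point can be made concrete with a small sizing sketch. All figures below are illustrative assumptions (the loaded FTE cost and the per-process savings fractions are invented for the example, not vendor data): no single process frees a whole head, yet the fractions still add up to a real annual figure to weigh against automation costs.

```python
# Hypothetical SMB RPA sizing: partial-FTE savings per process.
# All figures are illustrative assumptions, not vendor data.
LOADED_FTE_COST = 60_000  # assumed annual fully-loaded cost per FTE

processes = {
    "invoice_entry":  0.40,  # fraction of one FTE saved by automating this process
    "report_merging": 0.25,
    "data_scrubbing": 0.10,
}

def annual_savings(fte_fractions, fte_cost):
    """Sum partial-FTE savings across processes into an annual dollar figure."""
    return sum(fte_fractions.values()) * fte_cost

savings = annual_savings(processes, LOADED_FTE_COST)
print(f"annual savings: ${savings:,.0f}")
```

Under these assumptions the three processes together save 0.75 FTE, worth $45,000 per year, which is the kind of granular, sub-head-count figure an SMB needs before committing to licensing and implementation costs.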
One of the top challenges of IIoT is keeping valuable business data secure. Cyberattacks against IIoT systems and critical network infrastructure have severe consequences, putting world governments on high alert. Enterprises must take adequate precautions to manage and protect data related to IIoT or machine-to-machine security. By securing every necessary remote connection with VPN management, it will be possible for enterprises to stay ahead of future cybersecurity threats.
AI pioneers have given us a glimpse of ambient AI and conditioned us to it, making it hard to give up. They have also set a very high bar for our expectations of what AI should do for our businesses. The point is that enterprises embarking on AI need to radically shift their approach to technology adoption and analytics. This is not a plug-and-play, bolt-on strategy. It takes work to go from POC to a capability that comes close to the expectations of AI set by our consumer experience.
Policy approaches to education, skills training, employment, and income distribution should all now assume a post-AI perspective. This will require us to question some of our most deeply held convictions. A world where autonomous AI systems can predict and manipulate our choices will force us to rethink the meaning of freedom. Similarly, we will also have to rethink the meaning and purpose of education, skills, jobs, and wages.
Most of the traditional large system integrators focus on the higher end of the technology stack; connectivity is usually assumed to exist. The core of their offerings focuses on analytics and business process integration with legacy ERP and other systems. Newer connectivity solutions such as LPWAN are not yet fully integrated into their domain expertise. The current need for IoT network integrators is driven precisely by this gap: the capability to determine the right network connectivity solution for lower-tier access in terms of power consumption and cost is nascent, and this very capability determines whether new projects can be delivered at a reasonable ROI.