For many CIOs who are expected to lead the charge towards the digital holy grail, the exercise can be a hair-raising challenge that requires them to navigate an uncharted IT minefield, with redundant and obsolete applications that have accumulated over the years threatening to blow the entire exercise out of the water. Before embarking on a digital transformation journey, it is essential that CIOs fully understand, and then streamline, the organisation's application portfolio.
Blockchain technology is young and changing very rapidly; widespread commercialization is still a few years off. Nonetheless, to avoid disruptive surprises or missed opportunities, strategists, planners, and decision makers across industries and business functions should pay heed now and begin to investigate applications of the technology. Blockchain ensures that data has not been tampered with, offering a layer of time stamping that removes multiple levels of human checking and makes transactions immutable. However, it isn't yet the cure-all that some believe it to be.
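The tamper-evidence described above comes from chaining timestamped blocks together by their hashes. As a minimal sketch (not any real blockchain implementation), each block stores the hash of the previous one, so editing an earlier record breaks every link after it:

```python
import hashlib
import json
import time

def make_block(data, prev_hash):
    """Bundle data with a timestamp and the previous block's hash."""
    block = {"data": data, "timestamp": time.time(), "prev_hash": prev_hash}
    payload = {k: block[k] for k in ("data", "timestamp", "prev_hash")}
    block["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify_chain(chain):
    """Recompute every hash; any edit to an earlier block breaks later links."""
    for i, block in enumerate(chain):
        payload = {k: block[k] for k in ("data", "timestamp", "prev_hash")}
        expected = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("genesis", "0")]
chain.append(make_block("alice pays bob 5", chain[-1]["hash"]))
chain.append(make_block("bob pays carol 2", chain[-1]["hash"]))
print(verify_chain(chain))            # the untouched chain verifies
chain[1]["data"] = "alice pays bob 500"   # tamper with a past record
print(verify_chain(chain))            # verification now fails
```

Real blockchains add consensus and distribution on top of this, which is where the "removes multiple levels of human checking" claim comes from: no single party can silently rewrite history.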
The coming year will be as exciting as ever when it comes to technology. We live in exponential times, and the more these technologies converge, the more exciting the opportunities become. The Internet of Things will drive the data deluge, which will help train artificial intelligence, which will result in better applications and more personalisation. New developments in chip design will continue Moore's law and result in more powerful machine learning algorithms, while decentralised solutions will bring control back to the consumer. 2020 will be the Year of Convergence. Here are the top seven technology trends for 2020.
Facial recognition isn't as simple as taking two pictures and seeing if they match. Facial recognition algorithms create a mathematical representation of a human face, called a face template, by identifying landmarks on the face such as the nose and eyes and calculating the distances between those landmarks. It is, at its most basic, computational geometry. The template represents the face, and is then compared to other templates to produce a match or a similarity score.
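The geometry can be sketched in a few lines. The landmarks and the similarity formula below are illustrative (real systems use dozens of landmarks or learned embeddings), but the principle is the same: turn landmark distances into a vector, then compare vectors:

```python
import math

def face_template(landmarks):
    """Turn (x, y) landmark points into a vector of pairwise distances."""
    template = []
    for i in range(len(landmarks)):
        for j in range(i + 1, len(landmarks)):
            template.append(math.dist(landmarks[i], landmarks[j]))
    return template

def similarity(t1, t2):
    """Similarity score in (0, 1]; 1.0 means identical templates."""
    return 1.0 / (1.0 + math.dist(t1, t2))

# Toy landmarks: left eye, right eye, nose tip, mouth centre (made-up coordinates)
face_a = [(30, 40), (70, 40), (50, 60), (50, 80)]
face_b = [(31, 41), (69, 40), (50, 61), (50, 79)]  # same face, slight measurement noise
print(similarity(face_template(face_a), face_template(face_b)))
```

A threshold on the score then decides whether two templates count as the same person, which is also where false matches and misses come from.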
There is clearly no other way to succeed in devops than to maximize the automation of your entire release process, from coding through production. At the heart of this maturity lies a continuous improvement mindset within the organization. Such a mindset should build, over time, a foundation of tools that fit the processes and the people with their respective skillsets. The market doesn't stand still; it continues to innovate around machine learning, AI, and software delivery automation capabilities.
Whether you are leading a small upgrade project for the latest software update or spearheading a companywide ERP redeployment initiative, identifying opportunities for automation and process improvement is critical for future success. Project managers play an integral part in driving these initiatives and making sure that each phase of the project life cycle is consistently adhered to. It is imperative to create a repository of historical data used in each project. Without the historical data and lessons learned, RPA and other advanced technologies will be challenging to implement and adapt.
Python is an interpreted, high-level, general-purpose programming language. Python's design philosophy emphasizes code readability with its notable use of significant whitespace. In this article, you will learn how to add and configure Black, pytest, Travis CI, Coveralls, and PyUp. We’ve set the stage for more secure code with more consistent style. Here’s our ten-step plan for this article. This guide is for macOS with Python 3.7. Everything works as of early 2019, but things change fast.
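The article's full ten-step walkthrough isn't reproduced here, but as a minimal sketch of the testing step: pytest discovers any file named `test_*.py` and runs the `test_*` functions in it, with plain `assert` statements as checks. The `slugify` helper below is a hypothetical example, not from the article:

```python
# test_slugify.py -- a minimal pytest-style test file (run with: pytest)

def slugify(title):
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"

def test_slugify_extra_spaces():
    assert slugify("  Python   Rocks ") == "python-rocks"
```

Black then formats files like this automatically, Travis CI runs the tests on every push, Coveralls reports which lines the tests exercised, and PyUp watches the pinned dependencies for updates.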
How does an organization help the self-serve advanced analytics model grow and thrive? Responsibility lies in a number of places within the enterprise. If an organization incorporates analytics into its strategy and business decisions, it will encourage the use of these tools throughout the organization. When a middle manager or team member understands that the senior management team values analytics and expects to see data-driven decisions, each business unit and department will embrace the use of self-serve advanced analytics.
If machine learning and statistics were synonymous, why are we not seeing every statistics department in every university closing down or transitioning to being a 'machine learning' department? Because they are not the same! A major difference between machine learning and statistics is indeed their purpose. However, saying machine learning is all about accurate predictions whereas statistical models are designed for inference is almost a meaningless statement unless you are well versed in these concepts.
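The difference in purpose can be made concrete on a single dataset. In the sketch below (toy data, NumPy only), the same ordinary least squares fit is read two ways: a machine-learning reading asks how well the model predicts (RMSE), while a statistical reading asks how certain we are about the slope itself (standard error and t-statistic):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1, n)   # true intercept 2.0, true slope 0.5

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]   # fitted [intercept, slope]
residuals = y - X @ beta

# Machine-learning reading: how accurate are the predictions?
rmse = np.sqrt(np.mean(residuals ** 2))

# Statistical reading: how certain are we about the slope?
sigma2 = residuals @ residuals / (n - 2)      # residual variance
cov = sigma2 * np.linalg.inv(X.T @ X)         # coefficient covariance matrix
slope_se = np.sqrt(cov[1, 1])                 # standard error of the slope
t_stat = beta[1] / slope_se                   # evidence the slope is nonzero

print(f"RMSE (prediction quality): {rmse:.2f}")
print(f"slope = {beta[1]:.2f} +/- {slope_se:.2f}, t = {t_stat:.1f}")
```

Both readings use the identical fitted model; what differs is the question being asked of it, which is the distinction the paragraph above is pointing at.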
A proof of concept (POC) is a popular way for businesses to evaluate the viability of a system, product, or service to ensure it meets specific needs or sets of predefined requirements. What does running a POC mean in practice specifically for data science? When it comes to the evaluation of data science solutions, POCs should prove not just that a solution solves one particular, specific problem, but that a system will provide widespread value to the company: that it’s capable of bringing a data-driven perspective to a range of the business’s strategic objectives.
The poor coordination of security measures may be due to subpar or ill-informed senior leadership. Cybersecurity is a prime concern for business leaders. Rightly so, as the dependence on IT uptime and resilience has never been greater. However, corporate boards need to move beyond awareness and rhetoric into action in order to reduce the risk exposure of their organization and ensure long-term success. In the digital era, virtually every board decision will affect the organization's cyber-risk posture. That's why cybersecurity should be a recurring item on board agendas and continually reassessed in terms of the broader risk framework.
The impact of surveillance capitalism and this new data economy will be tremendous, but the outlook for the future is still up in the air. If we continue with an unregulated, poorly designed variation of surveillance capitalism, we stand to reap the consequences of a pretty inhumane, prescribed reality. If we take advantage of the brilliant people and institutions that the digital economy has empowered and train them, and the rest of us, to develop, use, and care for a suite of ethics-driven, nuanced, solutions-oriented products, we could easily wind up the beneficiaries.
Symbolic AI involves the explicit embedding of human knowledge and behavior rules into computer programs. The practice showed a lot of promise in the early decades of AI research. But in recent years, as neural networks, also known as connectionist AI, gained traction, symbolic AI has fallen by the wayside. Being able to communicate in symbols is one of the main things that make us intelligent. Therefore, symbols have also played a crucial role in the creation of artificial intelligence.
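"Explicit embedding of human knowledge and behavior rules" can be illustrated with a toy forward-chaining rule engine. The facts and rules below are made up for illustration; the point is that every piece of knowledge is hand-written by a person, and nothing is learned from data:

```python
# A minimal symbolic-AI sketch: hand-written rules, no learning involved.
facts = {"has_fur", "gives_milk"}

# Each rule: if all conditions are known facts, conclude something new.
rules = [
    ({"has_fur"}, "is_mammal"),
    ({"gives_milk"}, "is_mammal"),
    ({"is_mammal", "gives_milk"}, "can_nurse_young"),
]

def forward_chain(facts, rules):
    """Apply every rule whose conditions hold until no new facts appear."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain(facts, rules))
```

Systems in this style are transparent (every conclusion can be traced to a rule) but brittle: they only know what someone explicitly wrote down, which is a large part of why connectionist approaches overtook them.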
P-values are always a headache to explain, even to someone who knows about them, let alone to someone who doesn't understand statistics. In statistical hypothesis testing, the p-value, or probability value, is, for a given statistical model, the probability that, when the null hypothesis is true, a statistical summary (such as the difference in sample means between two groups) would be equal to, or more extreme than, the actual observed result. This post explains p-values in an easy-to-understand way, without all the pretentiousness of statisticians.
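That definition can be computed directly with a permutation test on toy data (the numbers below are made up for illustration). If the null hypothesis is true, the group labels are meaningless, so we shuffle them thousands of times and count how often a difference at least as extreme as the observed one appears by chance. That fraction is the p-value:

```python
import random

random.seed(42)

# Two small groups of measurements (toy data, illustrative only)
group_a = [5.1, 4.9, 6.2, 5.8, 6.0, 5.5, 6.1, 5.9]
group_b = [4.8, 4.5, 5.0, 4.7, 5.2, 4.6, 4.9, 5.1]
observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

# Under the null, labels don't matter: reshuffle and see how often random
# relabelling produces a difference at least this extreme.
pooled = group_a + group_b
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[:8]) / 8 - sum(pooled[8:]) / 8
    if abs(diff) >= abs(observed):
        extreme += 1

p_value = extreme / trials
print(f"observed difference = {observed:.2f}, p-value = {p_value:.4f}")
```

Read literally: a small p-value says "if the two groups really were the same, a gap this big would almost never happen by luck." It says nothing about the probability that the null hypothesis itself is true, which is exactly where most explanations go wrong.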
Machine learning is still in its early stages. Indeed, both software and hardware components are constantly evolving to meet the current demands of ML. Deploying Machine Learning is and will continue to be difficult, and that’s just a reality that organizations are going to need to deal with. Thankfully though, a few new architectures and products are helping data scientists. Moreover, as more companies are scaling data science operations, they are also implementing tools that make model deployment easier.
Now the technology cycle has thrown up edge computing but the shift is still in its infancy. The shift was partly created by the Internet of Things — the massive explosion in devices, all with their own processing power. This processing power was often under-utilised. The swing in the technology cycle to edge computing has also been driven by privacy concerns — the ability to store personal data on individuals’ own devices, rather than somewhere in the cloud.
With effective data ownership regulations, governments can ensure that businesses use confidential data ethically. However, creating regulations alone is not enough. Users like us must be aware of the value of our data and avoid simply giving our confidential data away to businesses. In this manner, we can ensure that we own our data and our data doesn't own us. Governments, for their part, must regulate data ownership to prevent privacy violations and ensure that businesses use modern technologies ethically.
Machine learning comes with challenges that the software engineering world is not familiar with. Building experiments represents a large part of our workflow, and doing that with messy code doesn't usually end well. When we extend scikit-learn and use its components to write our experiments, we make the task of maintaining our codebase easier, bringing sanity to our day-to-day tasks. Learn how to extend scikit-learn code to make your experiments easier to maintain and reproduce.
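The standard way to extend scikit-learn is to subclass `BaseEstimator` and `TransformerMixin` and follow the estimator contract: hyperparameters in `__init__`, learned attributes with a trailing underscore, and `fit` returning `self`. The transformer below is a hypothetical example (not from the article) that clips outliers to quantiles learned during `fit`:

```python
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin

class ClipOutliers(BaseEstimator, TransformerMixin):
    """Toy transformer: clip each feature to a quantile range learned in fit."""

    def __init__(self, lower=0.05, upper=0.95):
        self.lower = lower
        self.upper = upper

    def fit(self, X, y=None):
        X = np.asarray(X, dtype=float)
        self.low_ = np.quantile(X, self.lower, axis=0)   # learned attributes
        self.high_ = np.quantile(X, self.upper, axis=0)  # end in underscore
        return self                                      # contract: return self

    def transform(self, X):
        return np.clip(np.asarray(X, dtype=float), self.low_, self.high_)

X = np.array([[1.0], [2.0], [3.0], [100.0]])
clipped = ClipOutliers().fit(X).transform(X)
print(clipped.ravel())
```

Because it honours the contract, this class drops straight into `Pipeline`, `GridSearchCV`, and `clone()` with no extra work, which is where the maintainability and reproducibility benefit comes from.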
The Internet of Things, or IoT, is defined as the network of physical objects, or "things", embedded with electronics, software, sensors, and connectivity that enable those objects to collect and exchange data. Many organizations today show interest in and demand for applying business intelligence (BI) to IoT data, systems, and processes. R&D and marketing and sales departments assign the highest levels of importance to IoT, as do larger manufacturing, financial services/insurance, and technology organizations. One of the most valuable insights is how critical the role of IoT champions, or IoT advocates, is to the successful adoption of IoT technologies today.
Most of us know the value of customer management—the problem is that we're using less-than-stellar tools, or using them in a way that is less than optimal. At the end of the day, customer management is about knowing what data to gather about your leads, keeping it up to date, and gaining insights from it in the fastest way possible. AI is a clear partner for CRMs and for companies looking to build a better relationship with both customer management and their customers.
Can RPA be implemented without expensive and extensive modifications to existing systems? The answer lies in the evolution of RPA to what we know as ‘Connected RPA’, a wholly new approach that is transformational because it is quick to implement and doesn’t require any coding. In essence, Connected RPA brings forward a new generation of digital workers who can access and read the user interface of legacy systems to interoperate and orchestrate any third-party application.
Intelligent automation can help HR professionals make smarter decisions, get more done with less, and shift HR's focus from manual, repetitive tasks to a more strategic, innovation-driven role in the business. Automation, learning systems, and artificial intelligence are all becoming key elements of HR practice, and HR leaders and practitioners need to embrace them as ways to help their businesses achieve their goals. Intelligent automation can improve HR management in many ways.
AI has become so pervasive we often don’t even recognize it anymore. Besides enabling us to speak to our phones and get answers back, intelligent algorithms are often working in the background, providing things like predictive maintenance for machinery and automating basic software tasks. As the technology becomes more powerful, it’s also forcing us to ask some uncomfortable questions that were once more in the realm of science fiction or late-night dorm room discussions.
Businesses now require regular and sustained innovation, but this is problematic for many organizations more used to operating in safe mode. The situation is recoverable when leadership commits to building a culture of innovation. This is especially important in IT, because it's here that innovation now starts and ends. To this end, organizations are rushing to embrace Agile and DevOps – the practices of choice for rapidly delivering high-quality software. But when it comes to culture, nothing is easy. Here are five tips to help ensure success.