{"id":22516,"date":"2020-12-22T09:47:58","date_gmt":"2020-12-22T09:47:58","guid":{"rendered":"https:\/\/www.experfy.com\/blog\/what-is-ensemble-learning\/"},"modified":"2023-09-14T17:39:47","modified_gmt":"2023-09-14T17:39:47","slug":"what-is-ensemble-learning","status":"publish","type":"post","link":"https:\/\/www.experfy.com\/blog\/ai-ml\/what-is-ensemble-learning\/","title":{"rendered":"What Is Ensemble Learning?"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"22516\" class=\"elementor elementor-22516\" data-elementor-post-type=\"post\">\n\t\t\t\t\t\t<section class=\"has_eae_slider elementor-section elementor-top-section elementor-element elementor-element-b93cfbc elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"b93cfbc\" data-element_type=\"section\" data-e-type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"has_eae_slider elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-6687385\" data-id=\"6687385\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-7621ae7 elementor-widget elementor-widget-text-editor\" data-id=\"7621ae7\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>The principle of \u201cthe wisdom of the crowd\u201d shows that a large group of people with average knowledge on a topic can provide reliable answers to questions such as predicting quantities, spatial reasoning, and general knowledge. The aggregate results cancel out the noise and can often be superior to those of highly knowledgeable experts. 
The same rule can apply to artificial intelligence applications that rely on\u00a0<a href=\"https:\/\/bdtechtalks.com\/2017\/08\/28\/artificial-intelligence-machine-learning-deep-learning\/\" target=\"_blank\" rel=\"noreferrer noopener\">machine learning<\/a>, the branch of AI that predicts outcomes based on mathematical models.<\/p>\n\n<p>In machine learning, crowd wisdom is achieved through ensemble learning. For many problems, the result obtained from an ensemble, a combination of machine learning models, can be more accurate than any single member of the group.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-69b103d elementor-widget elementor-widget-heading\" data-id=\"69b103d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">How does ensemble learning work?<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d8690d8 elementor-widget elementor-widget-text-editor\" data-id=\"d8690d8\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Say you want to develop a machine learning model that predicts inventory stock orders for your company based on historical data you have gathered from previous years. You train four machine learning models, each using a different algorithm: linear regression, a support vector machine, a regression decision tree, and a basic\u00a0<a href=\"https:\/\/bdtechtalks.com\/2019\/08\/05\/what-is-artificial-neural-network-ann\/\" target=\"_blank\" rel=\"noreferrer noopener\">artificial neural network<\/a>. But even after much tweaking and configuration, none of them achieves your desired 95 percent prediction accuracy. 
These machine learning models are called \u201cweak learners\u201d because they fail to converge to the desired level.<\/p>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b3a2920 elementor-widget elementor-widget-image\" data-id=\"b3a2920\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t<img fetchpriority=\"high\" decoding=\"async\" width=\"696\" height=\"321\" src=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/disparate-machine-learning-models.jpg\" class=\"attachment-large size-large wp-image-18219\" alt=\"disparate machine learning models What Is Ensemble Learning?\" srcset=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/disparate-machine-learning-models.jpg 696w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/disparate-machine-learning-models-300x138.jpg 300w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/disparate-machine-learning-models-610x281.jpg 610w\" sizes=\"(max-width: 696px) 100vw, 696px\" \/>\t\t\t\t\t\t\t\t\t\t\t<figcaption class=\"widget-image-caption wp-caption-text\">Single machine learning models do not provide the desired accuracy<\/figcaption>\n\t\t\t\t\t\t\t\t\t\t<\/figure>\n\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-25a5c37 elementor-widget elementor-widget-text-editor\" data-id=\"25a5c37\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>But weak doesn\u2019t mean useless. You can combine them into an ensemble. For each new prediction, you run your input data through all four models, and then compute the average of the results. 
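A minimal sketch of this averaging step, using scikit-learn with synthetic data standing in for the inventory history (the feature names and data are illustrative assumptions, not from a real dataset), might look like:

```python
# Sketch: average the predictions of four disparate regressors.
# The random data below stands in for the inventory history.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 3))  # e.g. month, past demand, price (hypothetical features)
y = X @ np.array([3.0, 1.5, -2.0]) + rng.normal(0, 0.1, 200)

# The four "weak learners" from the example above.
models = [
    LinearRegression(),
    SVR(),
    DecisionTreeRegressor(max_depth=5),
    MLPRegressor(max_iter=2000),
]
for m in models:
    m.fit(X, y)

# For each new input, run all four models and average their outputs.
X_new = rng.random((5, 3))
ensemble_pred = np.mean([m.predict(X_new) for m in models], axis=0)
print(ensemble_pred)
```

scikit-learn also ships a ready-made `VotingRegressor` that performs this same averaging for you.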
When examining the new result, you see that the aggregate results provide 96 percent accuracy, which is more than acceptable.<\/p>\n\n<p>The reason ensemble learning is effective is that your machine learning models work differently. Each model might perform well on some data and less accurately on others. When you combine all of them, they cancel out each other\u2019s weaknesses.<\/p>\n\n<p>You can apply ensemble methods to both regression problems, like the inventory prediction example we just saw, and classification problems, such as determining whether a picture contains a certain object.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ac31c77 elementor-widget elementor-widget-image\" data-id=\"ac31c77\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"696\" height=\"392\" src=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/ensemble-machine-learning-models.jpg\" class=\"attachment-large size-large wp-image-18220\" alt=\"ensemble machine learning models Ensemble Learning?\" srcset=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/ensemble-machine-learning-models.jpg 696w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/ensemble-machine-learning-models-300x169.jpg 300w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/ensemble-machine-learning-models-610x344.jpg 610w\" sizes=\"(max-width: 696px) 100vw, 696px\" \/>\t\t\t\t\t\t\t\t\t\t\t<figcaption class=\"widget-image-caption wp-caption-text\">Ensemble machine learning combines several models to improve the overall results.<\/figcaption>\n\t\t\t\t\t\t\t\t\t\t<\/figure>\n\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-735f366 elementor-widget 
elementor-widget-heading\" data-id=\"735f366\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Ensemble methods<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-356ca64 elementor-widget elementor-widget-text-editor\" data-id=\"356ca64\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>For a machine learning ensemble, you must make sure your models are independent of each other (or as independent of each other as possible). One way to do this is to create your ensemble from different algorithms, as in the above example.<\/p>\n\n<p>Another ensemble method is to use instances of the same <a href=\"https:\/\/www.experfy.com\/blog\/ai-ml\/machine-learning-algorithms-in-laymans-terms-part-2\/\" target=\"_blank\" rel=\"noreferrer noopener\">machine learning algorithms<\/a> and train them on different data sets. For instance, you can create an ensemble composed of 12 linear regression models, each trained on a subset of your training data.<\/p>\n\n<p>There are two key methods for sampling data from your training set. \u201cBootstrap aggregation,\u201d aka \u201cbagging,\u201d takes random samples from the training set \u201cwith replacement.\u201d The other method, \u201cpasting,\u201d draws samples \u201cwithout replacement.\u201d<\/p>\n\n<p>To understand the difference between the sampling methods, here\u2019s an example. Say you have a training set with 10,000 samples and you want to train each machine learning model in your ensemble with 9,000 samples. 
If you\u2019re using bagging, for each of your machine learning models, you take the following steps:<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-fac7197 elementor-widget elementor-widget-text-editor\" data-id=\"fac7197\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<ol type=\"1\"><li>Draw a random sample from the training set.<\/li><li>Add a copy of the sample to the model\u2019s training set.<\/li><li>Return the sample to the original training set.<\/li><li>Repeat the process 8,999 times.<\/li><\/ol>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cb068a0 elementor-widget elementor-widget-image\" data-id=\"cb068a0\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"666\" height=\"539\" src=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/bagging-sampling.jpg\" class=\"attachment-large size-large wp-image-18221\" alt=\"bagging sampling What Is Ensemble Learning?\" srcset=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/bagging-sampling.jpg 666w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/bagging-sampling-300x243.jpg 300w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/bagging-sampling-610x494.jpg 610w\" sizes=\"(max-width: 666px) 100vw, 666px\" \/>\t\t\t\t\t\t\t\t\t\t\t<figcaption class=\"widget-image-caption wp-caption-text\">Bagging sampling draws samples from the training set and replaces them<\/figcaption>\n\t\t\t\t\t\t\t\t\t\t<\/figure>\n\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-227e58d elementor-widget 
elementor-widget-text-editor\" data-id=\"227e58d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>When using pasting, you go through the same process, with the difference that samples are not returned to the training set after being drawn. Consequently, the same sample might appear in a model\u2019s several times when using bagging but only once when using pasting.<\/p>\n\n<p>After training all your machine learning models, you\u2019ll have to choose an aggregation method. If you\u2019re tackling a classification problem, the usual aggregation method is \u201cstatistical mode,\u201d or the class that is predicted more than others. In regression problems, ensembles usually use the average of the predictions made by the models.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cbd72a7 elementor-widget elementor-widget-image\" data-id=\"cbd72a7\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"696\" height=\"562\" src=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/pasting-sampling.jpg\" class=\"attachment-large size-large wp-image-18222\" alt=\"pasting sampling What Is Ensemble Learning?\" srcset=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/pasting-sampling.jpg 696w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/pasting-sampling-300x242.jpg 300w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/pasting-sampling-610x493.jpg 610w\" sizes=\"(max-width: 696px) 100vw, 696px\" \/>\t\t\t\t\t\t\t\t\t\t\t<figcaption class=\"widget-image-caption wp-caption-text\">Pasting draws samples from the training set and replaces 
them<\/figcaption>\n\t\t\t\t\t\t\t\t\t\t<\/figure>\n\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-3a6006b elementor-widget elementor-widget-heading\" data-id=\"3a6006b\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Boosting methods<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-87f2fa7 elementor-widget elementor-widget-text-editor\" data-id=\"87f2fa7\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Another popular ensemble technique is \u201cboosting.\u201d In contrast to classic ensemble methods, where machine learning models are trained in parallel, boosting methods train them sequentially, with each new model building on the previous one and correcting its shortcomings.<\/p>\n\n<p>AdaBoost (short for \u201cadaptive boosting\u201d), one of the more popular boosting methods, improves the accuracy of ensemble models by adapting new models to the mistakes of previous ones. After training your first machine learning model, you single out the training examples misclassified or wrongly predicted by the model. When training the next model, you put more emphasis on these examples. This results in a machine learning model that performs better where the previous one failed. The process repeats itself for as many models as you want to add to the ensemble. The final ensemble contains several machine learning models of different accuracies, which together can provide better accuracy than any one of them alone. 
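A minimal sketch of this process with scikit-learn's `AdaBoostClassifier`, which by default boosts a sequence of depth-1 decision trees ("stumps") as its weak learners (the data here is synthetic, purely for illustration):

```python
# Sketch: AdaBoost trains models sequentially, reweighting the
# examples that earlier models got wrong.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Synthetic binary classification data standing in for a real dataset.
X, y = make_classification(n_samples=500, random_state=0)

# 50 weak learners, each trained with more weight on the previous
# models' mistakes; their weighted votes form the final prediction.
ada = AdaBoostClassifier(n_estimators=50, random_state=0)
ada.fit(X, y)
print(ada.score(X, y))  # training accuracy of the boosted ensemble
```

In the fitted model, `ada.estimator_weights_` holds the per-model voting weights, which reflect each weak learner's accuracy, as described above.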
In boosted ensembles, the output of each model is given a weight that is proportionate to its accuracy.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b311341 elementor-widget elementor-widget-video\" data-id=\"b311341\" data-element_type=\"widget\" data-e-type=\"widget\" data-settings=\"{&quot;youtube_url&quot;:&quot;https:\\\/\\\/youtu.be\\\/BoGNyWW9-mE&quot;,&quot;video_type&quot;:&quot;youtube&quot;,&quot;controls&quot;:&quot;yes&quot;}\" data-widget_type=\"video.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-wrapper elementor-open-inline\">\n\t\t\t<div class=\"elementor-video\"><\/div>\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-88c0577 elementor-widget elementor-widget-heading\" data-id=\"88c0577\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Random forests<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5dcd4ac elementor-widget elementor-widget-text-editor\" data-id=\"5dcd4ac\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>One area where ensemble learning is very popular is decision trees, a machine learning algorithm that is very useful because of its flexibility and interpretability. 
Decision trees can make predictions on complex problems, and they can also trace back their outputs to a series of very clear steps.<\/p>\n\n<p>The problem with decision trees is that they don\u2019t create smooth boundaries between different classes unless you break them down into too many branches, in which case they become prone to \u201coverfitting,\u201d a problem that occurs when a machine learning model performs very well on training data but poorly on novel examples from the real world.<\/p>\n\n<p>This is a problem that can be solved through ensemble learning. Random forests are machine learning ensembles composed of multiple decision trees (hence the name \u201cforest\u201d). Using random forests ensures that a machine learning model does not get caught up in the specific confines of a single decision tree.<\/p>\n\n<p>Random forests have their own independent implementation in Python machine learning libraries such as scikit-learn.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c8cf203 elementor-widget elementor-widget-heading\" data-id=\"c8cf203\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Challenges of ensemble learning<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-99f2761 elementor-widget elementor-widget-image\" data-id=\"99f2761\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"696\" height=\"392\" src=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/random-vectors.jpg\" class=\"attachment-large size-large wp-image-18223\" alt=\"random vectors\" 
srcset=\"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/random-vectors.jpg 696w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/random-vectors-300x169.jpg 300w, https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2021\/05\/random-vectors-610x344.jpg 610w\" sizes=\"(max-width: 696px) 100vw, 696px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5de7fd4 elementor-widget elementor-widget-text-editor\" data-id=\"5de7fd4\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\tp>While ensemble learning is a very powerful tool, it also has some tradeoffs.<\/p>\n\n<p>Using ensemble means you must spend more time and resources on training your machine learning models. For instance, a random forest with 500 trees provides much better results than a single decision tree, but it also takes much more time to train. Running ensemble models can also become problematic if the algorithms you use require a lot of memory.<\/p>\n\n<p>Another problem with ensemble learning is\u00a0<a href=\"https:\/\/bdtechtalks.com\/2018\/09\/25\/explainable-interpretable-ai\/\" target=\"_blank\" rel=\"noreferrer noopener\">explainability<\/a>. While adding new models to an ensemble can improve its overall accuracy, it makes it harder to investigate the decisions made by the AI algorithm. A single machine learning models such as decision tree is easy to trace, but when you have hundreds of models contributing to an output, it is much more difficult to make sense of the logic behind each decision.<\/p>\n\n<p>As with most everything you\u2019ll encounter in machine learning, ensemble is one among the many tools you have for solving complicated problems. It can get you out of difficult situations, but it\u2019s not a silver bullet. 
Use it wisely.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>In machine learning, crowd wisdom is achieved through ensemble learning. For many problems, the result obtained from an ensemble, a combination of machine learning models, can be more accurate than any single member of the group.<\/p>\n","protected":false},"author":109,"featured_media":16987,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"content-type":"","footnotes":""},"categories":[183],"tags":[97,1157,92],"ppma_author":[1946],"class_list":["post-22516","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-ml","tag-artificial-intelligence","tag-ensemble-learning","tag-machine-learning"],"authors":[{"term_id":1946,"user_id":109,"is_guest":0,"slug":"ben-dickson","display_name":"Ben Dickson","avatar_url":"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2020\/04\/medium_8aaf6bea-c4c1-455f-8156-8007d70910f8-150x150.jpg","user_url":"https:\/\/bdtechtalks.com\/","last_name":"Dickson","first_name":"Ben","job_title":"","description":"Ben Dickson is an experienced software engineer and tech blogger. 
He contributes regularly to major tech websites such as the Next Web, the Daily Dot, PCMag.com, Cointelegraph, VentureBeat, International Business Times UK, and The Huffington Post."}],"_links":{"self":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts\/22516","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/users\/109"}],"replies":[{"embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/comments?post=22516"}],"version-history":[{"count":4,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts\/22516\/revisions"}],"predecessor-version":[{"id":32984,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts\/22516\/revisions\/32984"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/media\/16987"}],"wp:attachment":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/media?parent=22516"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/categories?post=22516"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/tags?post=22516"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/ppma_author?post=22516"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}