
Extracting Textual Data using Machine Learning and Creating a Rules Engine

We are looking for engineering help, under the guidance of our CTO, for the following work to support an ongoing project:

  • Extracting specific textual data from content using machine learning (TensorFlow and the like), with the goal of improving extraction accuracy over time. A training data set can be provided.
  • Creating a rules engine for use with open-source search software such as Elasticsearch, Lucene, or Solr. The rules engine will override the search engine's default results and return specific data for certain queries, including queries carrying geolocation data such as latitude/longitude and ZIP code.
  • Data scripting support, including ETL that moves flat files from staging to production in MySQL, data aggregation, and shepherding scripts to make sure they aren't failing (reviewing log files, restarting as needed). Light DevOps work.
  • Helping to automate data validation to ensure that data moved by scripts matches the source (fingerprinting a small subset of the data and verifying that it still matches after ETL).
  • System testing for quality assurance.
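The fingerprint-based validation mentioned in the list above could be sketched roughly as follows; the sampling rate and row layout are assumptions for illustration, not the project's actual design:

```python
import hashlib

def fingerprint(rows, sample_every=100):
    """Hash a deterministic sample of rows so source and target
    extracts can be compared cheaply after an ETL run."""
    digest = hashlib.sha256()
    for i, row in enumerate(sorted(map(str, rows))):
        if i % sample_every == 0:  # fingerprint only a small subset
            digest.update(row.encode("utf-8"))
    return digest.hexdigest()

# Hypothetical source and post-ETL target extracts
source_rows = [("id-%d" % i, i * 2) for i in range(1000)]
target_rows = list(source_rows)  # what the ETL script loaded

assert fingerprint(source_rows) == fingerprint(target_rows)
```

Sorting before sampling makes the fingerprint independent of row order, so it survives ETL steps that reorder data.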

Ideally, this person or team will have experience with AWS, Python, search technology, and machine learning, along with an interest in data fusion architectures and in doing interesting, fascinating work.

Machine Learning
Analytics
Amazon Web Services

$80/hr - $150/hr

20 Proposals Status: CLOSED

Net 7

Client: V********

Posted: Jun 02, 2017

Algorithm to Automate the Identification of Orbital Positions/Frequency Bands from a Continuously Updated Dataset

Yes, it's rocket science. Yes, it's hard. But do you want to help change the world? To make a disruptive company even more disruptive? Seventeen years ago, little old ManSat from the Isle of Man changed how radio frequency licences were sought by applying transparent commercial business practices to a logical yet byzantine process at the ITU.

Today, we need your help to take this one step further by applying machine learning to that process as the next logical step. If we get it right, this will mean more people online, more commerce, and more communications for everyone on the planet.

The International Telecommunication Union (ITU) maintains a database of all geostationary and other orbital spectrum and the associated satellite filings. This database is called the Master International Frequency Register, or MIFR for short. PhD-level experts analyze this database using specific ITU software that references the Radio Regulations.

Yet the MIFR is simply a large database analyzed using a rigid set of rules, supplemented by a series of equally rigid calculations and equations that relate the power levels of specific satellites and frequency ranges to the data in a logical fashion. Sound familiar?

We know the data. We know the rules. We know the calculations and equations. What we want is the ability to use machine learning to do all of this, freeing our people's time to act on the results of the analysis. We need your help.

OBJECTIVES

A. Analysis of the International Telecommunication Union (ITU) Master International Frequency Register (MIFR) and the ITU databases that record satellite filings submitted to the ITU but not yet recorded in the MIFR, with a view to:

  • Identifying available satellite orbital positions and associated frequency bands that would allow for the deployment of new satellite services to a given part of the world (like points on a curve/circle)
  • As part of the above, providing a search facility over the ITU SRS database (within which MIFR data is recorded) to search for data by orbital position, satellite name, satellite filing name, frequency bands, service area, date of submission of the filing, etc.
  • Tasking the algorithms to apply the same IFIC criteria to the larger data sets, using the same parameters, to seek opportunities for filing and frequency use in the MIFR that others have overlooked or not yet anticipated: helping us identify unused or under-utilized spectrum and gaps in the orbital arc.
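As a minimal sketch of the search facility described above, a filter over filing records could look like this; the field names are illustrative assumptions, not the actual SRS schema:

```python
def search_filings(filings, orbital_position=None, frequency_band=None,
                   satellite_name=None):
    """Filter filing records on the query fields the brief lists
    (orbital position, frequency band, satellite name, ...)."""
    results = []
    for f in filings:
        if orbital_position is not None and f["orbital_position"] != orbital_position:
            continue
        if frequency_band is not None and frequency_band not in f["frequency_bands"]:
            continue
        if satellite_name is not None and satellite_name.lower() not in f["satellite_name"].lower():
            continue
        results.append(f)
    return results

# Hypothetical records, not real ITU data
filings = [
    {"satellite_name": "SAT-A", "orbital_position": 13.0, "frequency_bands": ["Ku"]},
    {"satellite_name": "SAT-B", "orbital_position": 19.2, "frequency_bands": ["Ka", "Ku"]},
]
hits = search_filings(filings, frequency_band="Ka")
```

In practice this filtering would run as queries against the SRS database itself; the sketch only shows the query dimensions involved.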

B. Analysis of the bi-weekly Radiocommunications Bureau International Information Frequency Circulars (BR IFICs) and ITU databases (SNS and SNL) with a view to:

  • Preparation of responses to IFIC publications, i.e. identification of satellite networks published in the IFIC that may affect a specified satellite filing or satellite networks in operation;
  • Preparation of frequency coordination plans for a given filing;
  • Routinely running the numbers on the IFICs received every two weeks from the ITU, with the same accuracy as (or better than) a person doing the same.

SUCCESS CRITERIA

A) The algorithm would allow us to identify available orbital positions/frequency bands to offer services to certain parts of the world;

B) Automated preparation of IFIC responses free of any errors.

DATA ASSETS

The data sets we use are referred to by the following acronyms: SRS, SNS, and SNL. All are databases maintained by the ITU.

The ITU has also developed a series of software tools (licence free) with which to analyze them.

The software tools can be found here: http://www.itu.int/en/ITUR/software/Pages/spacenetwork-software.aspx

  • SRS: contains alphanumeric and graphic information relating to satellite networks and earth stations recorded in the MIFR or in the process of coordination in accordance with Section II of Article 9 of the Radio Regulations or published under the advanced publication of information procedure in accordance with Section I of Article 9.
  • SNS: contains Appendix 4 data of geostationary, non-geostationary, and earth station filings.
  • SNL: lists basic information concerning planned or existing space stations, earth stations and radio astronomy stations. It includes sections on Advanced Publication Information, coordination requests, notifications, Plans information and their related processing backlog.

Size and Timeliness of the Data 

The IFICs data is published biweekly. These databases range in size from 1 to 10 GB depending on the number of networks published.

Data Collection Mechanism

The IFICs data has to be downloaded from the ITU website every two weeks (requires a subscription). The SRS database is included in these downloads. The SNS and SNL databases are online.
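The biweekly download cadence above could be driven by a simple schedule helper; the reference publication date below is an arbitrary placeholder, not an actual ITU publication date:

```python
from datetime import date, timedelta

def next_ific_date(today, reference=date(2017, 1, 3)):
    """Return the next IFIC publication date on a 14-day cycle
    anchored at an (assumed) reference publication date."""
    days_since = (today - reference).days
    cycles = days_since // 14 + (1 if days_since % 14 else 0)
    return reference + timedelta(days=14 * max(cycles, 0))
```

A scheduler (cron, for instance) could call this to decide when the next subscription download is due.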

PROPOSALS

Please provide your approach to automate the collection, storing and analysis of the data in the cloud.  We would like a simple system that generates predefined reports on a weekly basis or on-demand. We would also like to understand the number of hours this project would take to get an idea of the budget.

Aerospace
Analytics
Data Quality

$100/hr - $150/hr

Starts Aug 11, 2017

9 Proposals Status: COMPLETED

Client: M******

Posted: May 25, 2017

Predictive Analytics Frameworks for Revenue Planning

The Ask:

We need a consultant to apply predictive analytics to our core business (Talent Platform) for revenue planning purposes. The core business generates revenue by seconding our attorneys to client locations, where they support those clients' in-house counsel on work ranging from maternity-leave cover to M&A transaction support documentation. Revenue comes from "engagements" spanning days, months, or years. An engagement's end date is flexible: it can end early or "extend" to a future date at any given time, per the client's needs, so there is a wide range of engagement LTV. The firm needs help predicting how current (and future, to-be-won) engagements will perform, based on historical analysis of past engagement outcomes (using the profile and characteristics of those engagements).

We have put together initial research/data that the consultant will build off. The final deliverable will be a board-ready presentation that outlines the predicted future behavior of the "book" (current engagements, mixed with assumptions on future "to be won" engagements).

The Details:

1-2 week framework starting mid to late May 2017

The Audience:

The executive leadership team and business planning team will use these outputs and assumptions to support outward looking planning models

Business Concerns

The issue: Our “book of business” is a portfolio of active engagements at any given time with an estimated end date. Based on the estimated end date we can forecast how much revenue we will generate from those active engagements in the future.

However, the variance between the estimated end date and the actual end date is quite large. On average, our engagements extend roughly +40% against the initial estimate. While we see a consistent variance at the portfolio level, if we were to look at the individual engagement level we would see a wide range of variances: some engagements end earlier than expected, and some run to more than 5x the initial estimated time frame. Our business planning team wants to "double-click" and apply predictive analytics to each engagement to predict the actual revenue it will generate in our current "book of business".

Why is this important?

Understanding how our book of business will perform for the rest of year (and in future years) is a mission critical piece of information for our business team in annual planning and determining sales staffing needs. We will have to build new business assumptions on top of our current book of business which then drives staffing and cost investments to support corporate growth objectives. In other words, whatever revenue is not generated from our current book must be won by sales teams to drive revenue growth.

What does success look like?

There are three main outputs of this exercise.

  1. A predictive analytics framework on our forward book of business that predicts our 2018 carryover revenue performance by engagement to within ±5% accuracy (total book forecast vs. actuals)
  2. A stand-alone model/segmentation of categorized engagement types with extension multipliers that can be refreshed going forward and used in future forecasting efforts
  3. A list of proposed data points that are currently not being captured that could increase the accuracy of this framework in the future
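Output 2 above (segment-level extension multipliers) could be sketched roughly as follows; the segment labels and durations are hypothetical, and months stand in for whatever duration unit the book-of-business reports use:

```python
from collections import defaultdict

def extension_multipliers(engagements):
    """Per-segment ratio of actual to estimated duration, usable as
    a multiplier when forecasting still-open engagements."""
    totals = defaultdict(lambda: [0.0, 0.0])  # segment -> [actual, estimated]
    for e in engagements:
        totals[e["segment"]][0] += e["actual_months"]
        totals[e["segment"]][1] += e["estimated_months"]
    return {seg: actual / est for seg, (actual, est) in totals.items()}

# Hypothetical historical engagements
history = [
    {"segment": "M&A support", "estimated_months": 3, "actual_months": 5},
    {"segment": "M&A support", "estimated_months": 2, "actual_months": 2},
    {"segment": "leave cover", "estimated_months": 6, "actual_months": 6},
]
multipliers = extension_multipliers(history)
```

Refreshing the multipliers is then just a matter of re-running the function on the latest historical actuals.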

Data Assets

● All data is in Excel

● Customer segmentation and existing customers/prospects data

● Historical actuals – engagement-level performance across 6,000+ engagements from 2014-2017 YTD

● Engagement performance is captured in our “book of business” reports which includes

     ○ The business team (sales team) that won the engagement and is responsible for managing the end date estimates

     ○ The primary legal specialty of the attorney working the engagement

     ○ The client name

     ○ The engagement name

     ○ The resource name (attorney name)

     ○ The estimated start and estimated end date (actual end date included when the engagement has ended)

     ○ Revenue generated per month

     ○ The billing type of each engagement (fixed fee per month or per day, estimated hours to be worked (hourly), or estimated days (daily rate))

     ○ The estimated utilization (hours, day, fixed fee) per month for each engagement

     ○ The geographical location of the engagement (driven by business team attribute)

Data Analytics

Historical approach: Historically, the main reason we would run an outlook on our book of business was to predict “carry over”. Carry over is the amount of revenue we have in each year that is generated from revenue sold/new sales in prior years. Towards the end of a given year, during business planning for the following year, we would need to assume how much carry over we would receive and then we would be able to make a statement on how much additional revenue (new business) we would need our sales teams to win. Our approach was to triangulate an answer using different methodologies.

First, we would look at historical carry over performance and use that historical range and apply that to the current year (with assumptions on how much our current year book would grow by the end of the year / usually using our corporate revenue forecasts).

Second, we would look at an engagement by engagement level and determine the average age of each engagement and apply a predicted age (push out engagement with end dates that fell short of the historical average duration) which resulted in an assumed revenue outlook for carry over. Lastly, we would apply a top down adjustment to the carry over forecasts for any large (outlier) engagements that would move the needle and we have better line of sight to via communication with the salesperson managing that engagement.

Legal
Pricing and Actuarial
Predictive Modeling

$5,000 - $10,000

Starts May 30, 2017

12 Proposals Status: COMPLETED

Client: A***** ***

Posted: May 15, 2017

Programmatic Advertising Visualization Application integrated with Vendor Software

We are seeking one or two experts to help us take wireframes/visual designs and business requirements and create an analytics application by 9/20/2017. This application will visualize data about programmatic advertising performance and allow users to explore and change the visualizations through filtering.

We will go over the business requirements in more detail with selected candidates. 

Note that this app contains several different visualizations, which we have already designed, so we simply need to make the visualizations interactive and integrate them with the SaaS via its API and IDE.

To do this work, we are looking for one or two people who can collaborate to:

  • Take the sample data and the visual design and enable it to work with data retrieved from a SaaS API and IDE
  • Use a programming language of your choice, such as Java or Python, to create the logic and backend that drive the interactivity of the visualization.
  • Connect your backend to the SaaS in order to get data from it in real-time (when requested by users)
  • Use D3.JS or similar to take the visual design we have and make it work interactively to our business requirements using the backend you (or your partner) have created.
  • Create a frontpage in JS/HTML that guides the user in selecting/loading the data at first, and then when the data is loaded displays the visualization within the same page. 
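The backend filtering logic the bullets describe could be sketched as below; the data fields and filter shape are assumptions for illustration, not the vendor's actual API:

```python
def filter_campaign_rows(rows, filters):
    """Apply user-selected filters (field -> allowed values) to
    campaign performance rows before sending them to the D3 frontend."""
    return [
        row for row in rows
        if all(row.get(field) in allowed for field, allowed in filters.items())
    ]

# Hypothetical programmatic-advertising performance rows
rows = [
    {"channel": "display", "impressions": 1200},
    {"channel": "video", "impressions": 800},
]
subset = filter_campaign_rows(rows, {"channel": {"video"}})
```

In the real application, `rows` would come from the SaaS API in real time and `subset` would be serialized as JSON for the D3.js visualization.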


The final product will be a production-ready application that enables the user to access the front page, upload or select their data, load it, and view and interact with the resulting visualization.

It is important that the front-end, which the user will see, be well-designed and aesthetically pleasing. Fortunately for this project, we have already designed the visualization we want people to see as an output. Thus, if you feel like you can get us to the technical functionality we want, but don’t have the design skills, that’s okay! 

We still want to talk, and can get a designer to help us take it over the finish line.   
We are open to using alternative technologies – but prefer open source and free software. This opportunity, once successful, could yield many other projects from us, so we are looking for one or two people to work with in the long-term beyond this project.  For the right person, this will be a lucrative and career-building experience, working with well-known entrepreneurs and thought-leaders in the analytics space.

This project is one of 8 projects, so if the hired expert is successful there will be future projects.

Data Visualization
D3.js
Java

$10,000

11 Proposals Status: COMPLETED

Net 7

Client: V********

Posted: May 10, 2017

Reusable Package for NLP Analysis

1. Create Portable Docker Environment (10 hours)

  • include all OS installs and Python libraries needed to run the code
  • ensure Cody and Josh can run the same code on their machines

2. Develop Architecture to Compose Analytics Module (5 hours)

  • review phase 1 scripts
  • design code refactoring architecture to modularize these scripts 

3. Generalize Translation API (10 hours)

  • Ensure Support and Extensibility for Korean, Japanese, Chinese 

4. Refactor Frequency Analysis ( 10 hours)

  • Rewrite code to expose key functionalities and configuration parameters
  • Test new module

5. Refactor Clustering Analysis (10 hours)

  • Rewrite code to expose key functionalities and configuration parameters

6. Testing and Document Overall Library Package (5 hours)

  • Document interface functions and provide examples in report
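Item 4's goal of exposing key functionality behind configuration parameters might look like this sketch; the function and parameter names are illustrative, not the project's actual interface:

```python
from collections import Counter

def term_frequencies(texts, min_count=1, lowercase=True):
    """Token frequency analysis with the key knobs (minimum count,
    case folding) exposed as parameters so the module is reusable."""
    counts = Counter()
    for text in texts:
        tokens = text.lower().split() if lowercase else text.split()
        counts.update(tokens)
    return {term: n for term, n in counts.items() if n >= min_count}

freqs = term_frequencies(["good value", "Good battery"], min_count=2)
```

Real multilingual text (Korean, Japanese, Chinese) would need a proper tokenizer rather than whitespace splitting; only the parameterized-interface idea is shown here.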

Milestones:

1. Define Key Interface Functions (Deadline: 5/31/2017)

  • Review definition of interface functions with Cody and Josh

2. Status Update (Deadline: 6/15/2017)

  • List and review any issues from code refactoring

3. Deliver Packaged Code (Deadline: 6/30/2017)

  • Provide report, packaged code and docker container

Note: this project is being awarded to the same expert who performed the first phase.

Consumer Goods and Retail
Natural Language Processing
Analytics

$120/hr - $125/hr

Starts May 08, 2017

1 Proposal Status: COMPLETED

Net 30

Client: A*****

Posted: May 08, 2017

Data Analyst for Multinational Financial Services Company

We are a financial services company that fuels small business success by providing fast and easy access to working capital.

We are looking for a Data Analyst with financial services experience who is highly analytical and has expert level experience in Excel and financial modeling. 

The candidate must be able to spot errors on the page when reviewing numbers, and produce quality, reliable results on which leadership will base major decisions.

This is an ongoing relationship. New projects come up on the fly and there are recurring monthly deliverables, so for this to be a good fit the analyst needs to be easy to reach and generally fast to respond.

The analyst would be working mainly with the CEO, but also with key management personnel.  

Responsibilities:

  • Create and maintain financial models that accurately predict the financial and operational results of the business under various circumstances. The analyst will aggregate data points from different sources to support decisions, for example: what kinds of loans to make, to which group, and where; which employee is producing the most, based on a number of data points, and why; or, given a proposal from an outside capital source, building a model or analysis in Excel to see whether it's worth pursuing.
  • Support specific business units, providing monthly variance analysis reporting for our credit facility, budgeting, forecasting, cost benefit analysis and other decision support.
  • Perform other ad hoc analysis as needed.

Qualifications:

  • Strong skills in Excel for data analytics
  • Excellent verbal and written communication skills
  • Accountable for deadlines agreed upon and committed to excellent work
  • Accomplished in financial services and Excel
  • At least 5 years of experience in working independently building dynamic models and data reporting
  • Bachelor's or Master's Degree in Finance, Accounting or related field required
  • Experience in business lending services preferred

Financial Services
Finance
Data Analysis and AI Tools

$60/hr - $100/hr

17 Proposals Status: IN PROGRESS

Client: G******** *******

Posted: May 08, 2017

UI Design of Collaboration Tool

We are looking to hire a designer to design a new screen for an existing collaboration tool (web application).  The screen we would like designed exists currently as an Excel document, and the collaboration tool already has an existing design to follow.  We are looking for someone with both UI/UX design and Adobe Illustrator experience.

We will be able to provide:

* The Excel document showing the new screen we want designed

* The design assets (.ai file) of another screen from the collaboration tool

* The design standards document which was used to design the other screens in the collaboration tool

Please include references or links to your previous work or portfolio along with your qualifications when applying to this project.

Adobe Illustrator
Software and Web Development

$50/hr - $100/hr

Starts May 10, 2017

1 Proposal Status: COMPLETED

Net 30

Client: C*******

Posted: May 01, 2017

Python Developer for a Machine Learning Application

We are looking for a Python Developer responsible for helping to develop a machine learning application. Your overall task will be working with experienced data scientists to test different methods of solving a supervised classification predictive analytics problem.  

Sub-tasks are: development of data ingestion and pre-processing code, running multiple models (generally programs from scikit-learn package), evaluating the fit of the models, and preparing to port the code into an HDFS production environment.  Expert facility with Python 2.7 programming and previous experience with machine learning/predictive analytics is required.  

Additional desired qualifications are explicit familiarity with Python package scikit-learn and Apache Spark under Azure/HDInsight environments.

The development environment is Anaconda / Jupyter iPython notebooks.
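As a rough illustration of the workflow described above (ingest, preprocess, fit a scikit-learn model, evaluate fit), here is a minimal sketch on toy data; it is not the project's actual pipeline, and the real work would compare multiple models:

```python
# Minimal supervised-classification workflow with scikit-learn
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Toy two-feature data; real inputs would come from the ingestion code
X = [[0.0, 1.0], [1.0, 0.0], [0.9, 0.1], [0.1, 0.9],
     [0.2, 0.8], [0.8, 0.2], [0.0, 0.9], [0.95, 0.05]]
y = [0, 1, 1, 0, 0, 1, 0, 1]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Pre-processing and model bundled so the same object ports cleanly
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)  # held-out evaluation of fit
```

Bundling pre-processing and the estimator in one `Pipeline` object is also what makes porting to a production environment easier: a single fitted artifact is serialized and deployed.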

Media and Advertising
Machine Learning
Predictive Modeling

$90/hr - $125/hr

Starts May 08, 2017

45 Proposals Status: COMPLETED

Client: V*** ********

Posted: Apr 27, 2017

Domo Expert for Debugging Cards and Verifying Data Sets

We have a client whose data and cards are already set up in a dashboard linked via Infusionsoft and a few other APIs, but now, when the client logs in, none of the cards are working.

We need someone to review the data sets & data flows and make sure everything is working again.

Once that is done we have on-going work for this client to build out more dashboards and connect more data.

domo
Analytics
BI Infrastructure

$90/hr - $150/hr

Starts Apr 27, 2017

7 Proposals Status: CLOSED

Client: P****** ******* * ***** ***** ***** ***

Posted: Apr 27, 2017

Data-Driven Global Philanthropic Platform Conceptualization with Requirements Definition

We are seeking proposals to engage the services of a technical advisor able to give practical realization to a unique global online philanthropic venture ("the Vehicle"), still at the conceptual stage.

The Platform

A "pass-through" Vehicle that leverages the latest technology and best-in-class accountability and transparency practices to unleash the catalytic potential of philanthropy, giving donors an unprecedented choice to direct funds strategically and effectively toward the world's major humanitarian and developmental challenges.

The Vehicle intends to improve and facilitate impact-driven bespoke reporting, with healthy competition for receipt of funds driving organisations to improve governance and the quality of impact reporting. This will in turn stimulate and encourage further donations through the Vehicle.

Sample Illustration of Initial Platform Mechanism: https://screencast.com/t/qaD6IAmprwf 

The platform should utilise Big Data to pull information from multiple sources and generate Automatic Reports for the donors as well as the recipient agencies. It will also have advanced artificial intelligence (AI) and smart algorithms to perform the following:

  • analyse data to showcase humanitarian needs based on input from humanitarian and development aid agencies, which would feed into and update the humanitarian priorities section on the platform in real time.
  • analyse and display trends to make it easy for donors to identify causes and beneficiaries of choice.
  • showcase the impact of the mass micro-donations of retail donors.
  • generate reports on donor trends, which will help beneficiary agencies position their appeals based on donor interests and requirements.
  • provide real-time and balanced exposure to all humanitarian needs around the world, enabling donors to make informed decisions on the most pressing issues and needs.

General Audience/ Users

1. General Audience

  • Host governments
  • Development institutions and international aid agencies

2. User1: Donors

  • Mass 'retail' donors
  • Ultra-affluent philanthropists
  • Mass affluent contributors
  • Mainstream and more modest givers

3. User2: Recipient Entities

  • Humanitarian and development aid agencies
  • Charities

Deliverables

Define the project roadmap, user adoption, content structure, requirements architecture, types of data needed to be captured, technology to use – the big picture, platform back-end, front-end and dashboard requirements.

1. System Requirements Definition

2. Information Architecture

3. Visual Design illustrating

  • a. full cycle; cause search, selection of desired cause, demography, geography, beneficiary agency, make donation ---- fast-forward, sample donor report
  • b. management dashboard

Timeline

Subject to project development.

Budget

Hourly rates to be quoted along with statement of experience on similar projects.

Main Issues to Address

The selection of the expert(s) will consider several factors, including relevance of experience, hourly rate, and availability, as well as the suggested approach to the following key issues:

  • A vision for the platform to provide optimal transparency and accountability for both donors and beneficiaries, in a manner that encourages more frequent and recurring giving, donation acceptance, and better, more frequent reporting.
  • Recommended methods to maintain the highest levels of data security.
  • Best-practice solutions to protect platform users from potential financial crime/fraud.
  • Suggested secure payment gateways that place zero liability on the platform for the transactions made.
  • Suggested ethical hacking requirements.
  • Optimal reporting and analytics tools enabling the platform's users and managers to enhance its performance and maximize its overall outcome.
  • Good governance tools that maintain the independence of the platform from donors as well as beneficiary organizations.
  • Suggested quality assurance methodologies and tools.
  • Suggested mechanisms for vetting recipient organisations.

Please note that this project does not require implementation. We would like the person hired to be able to produce wireframes showing user interactions and page elements for the main pages, to demonstrate the concept. If you or your team do not possess visual design skills, that part of the job can be performed by another expert on Experfy. Please be candid about your capabilities.

If you can share examples of relevant past work (it can be anonymized), it would be helpful. We would like to see how you think about design and conceptual design. Experience designing platforms and marketplaces should be highlighted.

Non-Profit
Strategic Business Planning
Dashboards & Scorecards

$100/hr - $175/hr

Starts May 01, 2017

9 Proposals Status: IN PROGRESS

Net 60

Client: P*******

Posted: Apr 14, 2017
