

Voice Analytics to Predict Customer Behavior

Our Project:

Take our recorded customer service phone calls and analyze each one to determine sentiment, creating a value that can be married to other attitudinal and behavioral data points as a predictive measure of future behavior (persisting as a client, tenuous state, attrition, etc.).

About TASC (Total Administrative Services Corporation):

TASC is a leader in the industry as the nation's largest privately-held TPA. TASC now offers more than 21 innovative products and services. Customers in all 50 states are served by the TASC team, which boasts 8,000 field representatives and over 900 associates at the Madison campus and remote locations. TASC works hard for tens of thousands of businesses, and last year the company's annual revenue exceeded $100 million.

Our Business Problem: 

For several years we have been collecting attitudinal survey information and have found that we are not able to accurately predict customer behavior using this data alone. We would like to shift the organization and the decision making of leadership toward data-driven decisions based not on attitudinal data alone but with more emphasis on behavioral data. To that end, we are looking at whether or not the phone call data that we currently capture (every interaction) can be used to predict future behavior. For example, if a customer calls and is "upset," does that translate into a termination when x, y, and z are also present? Can we answer the following:

  • We want to know when the customer's state of mind is tenuous before they even know.
  • We want to be able to "read" people based on certain things before they even know how they feel.

Expertise Needed: Voice Analytics

Data Sources:

Each unique call is stored as a .WAV file and has a unique ID with metadata including the assigned agent, duration, product line, customer ID, etc. We also have customer demographics available in MySQL, along with attitudinal data in .XLSX files (separate files for each deployment, but each has the customer ID attached to each record).
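
To make the shape of the data concrete, here is a minimal Python sketch (not TASC's pipeline) of how a per-call sentiment score could be attached to the call metadata and joined to the MySQL demographics and the .XLSX attitudinal files on customer ID. The connection string, file names, column names, and the score_sentiment() stub are illustrative assumptions.

```python
# Minimal sketch, assuming a metadata export, a MySQL demographics table,
# and per-deployment XLSX surveys keyed by customer ID (all hypothetical).
import pandas as pd
from sqlalchemy import create_engine

def score_sentiment(wav_path: str) -> float:
    """Placeholder: a real implementation would run speech-to-text and/or
    acoustic sentiment analysis on the .WAV and return a score in [-1, 1]."""
    return 0.0

# Call metadata exported from the phone system (hypothetical file and columns:
# call_id, customer_id, agent, duration, product_line, wav_path).
calls = pd.read_csv("call_metadata.csv")
calls["sentiment"] = calls["wav_path"].map(score_sentiment)

# Customer demographics from MySQL (hypothetical credentials and table name).
engine = create_engine("mysql+pymysql://user:password@host/tasc")
demographics = pd.read_sql("SELECT * FROM customer_demographics", engine)

# Attitudinal survey deployments, one .XLSX per deployment (hypothetical files).
attitudinal = pd.concat(
    [pd.read_excel(f) for f in ["survey_2015.xlsx", "survey_2016.xlsx"]],
    ignore_index=True,
)

# One row per call, ready to feed a model of persist / tenuous / attrition states.
modeling_table = (
    calls.merge(demographics, on="customer_id", how="left")
         .merge(attitudinal, on="customer_id", how="left")
)
modeling_table.to_sql("call_sentiment_features", engine, if_exists="replace", index=False)
```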

Deliverable: 

At minimum, the deliverable would be a database containing the call metadata and the 'sentiment rating' (or another label for whatever analytics can be performed). Furthermore, we could commission the same entity to do the full analysis of the data set, but the primary objective is to get the quantifiable data from the voice records. (This would be an ongoing engagement at an X frequency, rather than a one-and-done type of project.)

Financial Services
Professional Services
Natural Language Processing

$10,000 - $20,000

Starts Sep 01, 2016

17 Proposals Status: COMPLETED

Client: T****

Posted: Aug 11, 2016

SQL + Python/Java Data Engineer - 6 months Contract for Top Ten E-commerce Retailer

Onsite 2-3 weeks in Atlanta, GA, and then remote.

Only candidates from the USA. No sponsorship.

  • Design, construct, develop, test and maintain data with relationships across a heterogeneous data platform
  • Build high-performance algorithms, prototypes, predictive models and proof of concepts
  • Research opportunities for data acquisition and new uses for existing data
  • Employ a variety of languages and tools (e.g. scripting languages) to marry systems together
  • Recommend ways to improve data reliability, efficiency, and quality
  • Collaborate with modelers and other IT team members on project goals

 

Looking for a passionate engineer with 5-6 years of programming experience, an analytical aptitude, and a collaborative mindset for a fast-paced e-commerce retailer. The person should be skilled in Java or Python, and SQL is a must. Should also be familiar with working alongside data scientists.

Consumer Goods and Retail
Java
Python

$90/hr - $100/hr

Starts Aug 22, 2016

6 Proposals Status: CLOSED

Client: E*******

Posted: Aug 11, 2016

Data Analyst with SAS and Medical Claims Experience On-Site in Waltham, MA

Fresenius Medical Care North America (FMCNA) is the premier health care company focused on providing the highest quality care to people with renal and other chronic conditions. Through its industry-leading network of dialysis facilities, outpatient cardiac and vascular labs, and urgent care centers, as well as the country’s largest practice of hospitalist and post-acute providers, Fresenius Medical Care provides coordinated health care services at pivotal care points for hundreds of thousands of chronically ill customers throughout the continent. As the world’s only fully vertically integrated renal company, it offers specialty pharmacy and laboratory services, and manufactures and distributes the most comprehensive line of dialysis equipment, disposable products, and renal pharmaceuticals.

PURPOSE & SCOPE:

Responsible for creating actionable claims-based analytics to support FMC’s partnerships with Accountable Care Organizations and other risk-based providers.

PRINCIPAL DUTIES AND RESPONSIBILITIES:

  • Perform analyses requested by FMC leadership of Medicare claims (Parts A, B, & D), sometimes combining these assets with other data sets.
  • Automate creation of regular claims-based reports to support ongoing process improvement.
  • Partner with FMC analytics leadership to generate new data inquiries and/or performance metrics to further inform clinical strategy.
  • Maintain an ever-expanding Medicare claims warehouse to support future analyses.
  • Present key findings to members of FMC’s analytics and corporate leadership.
  • Assist, if possible, with the development of claims-based predictive models.

EDUCATION:

Bachelor’s degree required, preferably in a relevant field (e.g., computer science, mathematics, engineering, biology). Some clinical training is a plus.

EXPERIENCE & REQUIRED SKILLS:

  • Must have extensive experience using SAS to develop complex business reports
  • Must have at least a strong familiarity with analyzing medical claims data
  • Experience in population health or clinical analytics preferred
  • Ability to collaborate with colleagues to create complex analytic work products
  • Detail-oriented, with strong organizational skills
  • Ability to work on multiple projects at the same time and meet deadlines

You MUST be able to work on-site in Waltham, MA and have the required work authorization. This is a six-month contract.

Healthcare
Claims Reporting
Managed Markets

$75/hr - $100/hr

4 Proposals Status: CLOSED

Client: F********* ******* **** ***** *******

Posted: Jul 27, 2016

Big Data Architect for Three-Month Assignment in Germany

We are one of the leading energy companies in Europe. We have developed Virtual Power Plant (VPP) capabilities to manage electricity supply and demand, e.g. to avoid frequency drops in the grid (fast frequency response) and to automatically and flexibly manage generation assets such as CHPs (reading from and writing to generation assets).

In this process, a lot of data is collected: operational data (real-time) as well as historical data. We currently do not exploit the value of the data collected (especially the historical data) and would like to explore patterns in the data, which will help us further optimize our VPP offering.

We have also developed an energy efficiency platform and analyze consumption records to offer alternatives and hence reduce energy costs. We would further like to help our B2B customers optimize onsite electricity generation versus drawing from the grid.

In order to create more value from our data we are looking for data architectural skills. The person will have the following responsibilities:

  • Working with a development team focused on developing Energy Solutions based on the command and control of thousands of remote machines
  • Responsible for envisioning and executing on design & implementation of a variety of use cases based on the real-time data streams from live equipment
  • Work closely with development teams
  • Responsible for all deliverables associated with the end-to-end data architecture and model, including source-to-target mappings and the data dictionary
  • Ensuring quality, performance, and security of database design and implementation
  • Providing advice and support for projects regarding all aspects of database development including relational, NoSQL, and time series technologies
  • Assisting with the set-up of frameworks for the ECT database service to fit our micro services architecture
  • Supporting the design and implementation of real-time data ingestion and processing using the Kafka, Spark, and Cassandra technology stack (see the sketch after this list)
  • Setting up quality assurance processes for database development
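
As a point of reference for the Kafka/Spark/Cassandra item above, here is a minimal PySpark Structured Streaming sketch, assuming a JSON telemetry topic and a pre-created Cassandra table. The broker, topic, schema, keyspace, and table names are illustrative, and the job needs the Kafka source and Spark-Cassandra connector packages on the classpath.

```python
# Minimal sketch (not the client's actual pipeline): Kafka -> Spark -> Cassandra.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = (
    SparkSession.builder
    .appName("vpp-ingest-sketch")
    .config("spark.cassandra.connection.host", "cassandra-host")  # hypothetical host
    .getOrCreate()
)

# Assumed message schema for asset telemetry.
schema = StructType([
    StructField("asset_id", StringType()),
    StructField("ts", TimestampType()),
    StructField("power_kw", DoubleType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "kafka-host:9092")  # hypothetical broker
    .option("subscribe", "asset-telemetry")                # hypothetical topic
    .load()
)

readings = (
    raw.select(F.from_json(F.col("value").cast("string"), schema).alias("r"))
       .select("r.*")
)

def write_batch(df, epoch_id):
    # foreachBatch lets each micro-batch be written via the Cassandra connector.
    (df.write
       .format("org.apache.spark.sql.cassandra")
       .options(keyspace="vpp", table="readings")  # hypothetical keyspace/table
       .mode("append")
       .save())

query = readings.writeStream.foreachBatch(write_batch).start()
query.awaitTermination()
```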

We are looking for an experienced Big Data expert with the following qualifications:

  • University degree in Information Technology, Software Development, Engineering or related fields
  • Proficiency in database design, implementation, administration, and performance optimization
  • Experience with non-relational databases (such as Hadoop, Cassandra, MongoDB, Cloudera)
  • Strategic and detailed understanding of major database products (e.g. Oracle technology, MS SQL Server technology, or open-source systems such as Postgres and MySQL)
  • Knowledge in logical and physical database design and data modeling
  • Experience in developing systems from scratch
  • Experience with data warehouse design principles
  • Familiarity with configuration management and software release procedures
  • Appreciation of general technical concepts (e.g. Networking protocols, Windows and Linux server products, Back-up and recovery, Storage, IT Security)
  • Experience with, and enjoyment of, working in distributed teams with various nationalities; comfortable working in a fast-paced, dynamic, agile environment
  • Business fluent in English

Ideally, we are looking for a long-term engagement.

More info on our business:

http://www.eon-connecting-energies.com/en.html

Energy and Utility
Data Mining
Forecasting

$80/hr - $95/hr

3 Proposals Status: CLOSED

Client: E****

Posted: Jul 25, 2016

Economic Sentiment Indicator Algorithms: Predicting Brexit-like Events Using Big Data

Zurich Insurance is a global insurance company. Our Investment Management team has several challenges where Big Data and the underlying technology can play a role. For instance:

• Outlooks on macroeconomics and financial markets are currently developed based on market indices, market economic indicators and a selection of research papers. These outlooks are also incorporated in the tactical asset allocation for our own assets and therefore can potentially generate great financial benefits.

• We believe we can complement current information with additional “non-traditional” information on economic and financial developments consisting of both structured data (e.g. time series, research pools) as well as unstructured data such as social media, Google search requests, geo-data, etc. However, the vast amount of data is simply too much for traditional methods.

• Brexit is a good example where we could back-test a possible solution. With Brexit, all "white-collar" reports and research indicated Brexit would not occur. We therefore would like to test a solution that could ingest and process large amounts of unstructured data and determine an economic sentiment indicator, in this case applying cluster analysis to the data to find economic turning points pre-Brexit.

We would like to develop a big data system with a set of algorithms that would give us predictive capabilities on a macro-economic level.  We can share more details with the experts with the best proposals at a later stage in the hiring process.
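
As one hedged illustration of the cluster-analysis idea mentioned above (not Zurich's actual method), the sketch below clusters timestamped text snippets with TF-IDF and k-means and watches for abrupt shifts in cluster shares as candidate turning points. The toy documents, cluster count, and shift threshold are all illustrative assumptions.

```python
# Minimal sketch: cluster unstructured text over time and flag share shifts.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical input: one row per document (news item, post, search-trend note).
docs = pd.DataFrame({
    "date": pd.to_datetime(["2016-05-01", "2016-05-01", "2016-06-10", "2016-06-10"]),
    "text": [
        "polls show remain comfortably ahead",
        "economists expect no change after the vote",
        "sterling volatility spikes as leave gains momentum",
        "searches for what the EU is surge in the UK",
    ],
})

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs["text"])

k = 2  # illustrative; in practice chosen via silhouette score or domain review
docs["cluster"] = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

# Weekly share of each cluster; a sharp shift in shares is surfaced as a
# candidate economic turning point for analysts to review.
shares = (docs.set_index("date")
              .groupby([pd.Grouper(freq="W"), "cluster"])
              .size()
              .unstack(fill_value=0)
              .pipe(lambda t: t.div(t.sum(axis=1), axis=0)))
turning_points = shares.diff().abs().max(axis=1) > 0.5  # illustrative threshold
print(shares)
print(turning_points)
```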

Financial Services
Insurance
Analytics

$100/hr - $175/hr

17 Proposals Status: CLOSED

Client: Z****** *********

Posted: Jul 24, 2016

Determining Collection and Liquidation Rates through Debt Portfolio Analysis

Serengeti Financial is part of a group of companies that we own that purchase charged-off debt portfolios and then do collections and collection litigation. Together, I refer to this group of companies as our Debt Management Business. I'm interested in getting Experfy's help in analyzing debt portfolios, projecting collection and liquidation rates, and using this information to price debt portfolios and bid competitively. I have been following recent advances in data analytics, AI, and supercomputing and would like to take full advantage of these technologies in our Debt Management Business.

For purposes of this project, my goal is to find a better way to analyze the data we are given on prospective debt portfolios and thereby arrive at a more accurate, more effective purchasing strategy. Ideally, we develop a game-changing technology and method for analyzing debt portfolios.

One unique problem we face is that when we are given a prospective portfolio for purchase or bidding, we often receive only a limited amount of information. Frequently we are given masked or partially masked files. And we usually have a limited amount of time to evaluate the portfolio and make a bidding or purchase decision; that timeframe could be as little as two or three days and as long as two weeks.

I would like to come up with a solution to quickly and effectively evaluate these portfolios. In addition, I'm interested in whether there is a way to find and analyze additional information about the portfolio based on the limited information we are given. In other words, is there other data out there in the universe that's accessible that would be relevant to our portfolio?

Finally, I'm not exactly sure of the deliverable and would rely on your advice and expertise. I was thinking of an algorithm that would be easy for us to use, but I'm sure you may have other and better ideas. Thank you very much for your assistance.
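
To make the pricing idea concrete, here is a hedged Python sketch (not Serengeti's model): score each account's probability of paying from historical portfolio outcomes, then cap the bid at expected collections divided by a target return multiple. The column names, recovery rate, and multiple are illustrative assumptions.

```python
# Minimal sketch: expected-collections pricing for a masked prospective portfolio.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical historical data: accounts from past portfolios with known outcomes.
history = pd.DataFrame({
    "balance": [1200, 450, 8000, 300, 2600, 5100],
    "months_since_chargeoff": [6, 30, 12, 48, 9, 24],
    "has_phone": [1, 0, 1, 0, 1, 1],
    "paid_anything": [1, 0, 1, 0, 1, 0],  # outcome label
})
features = ["balance", "months_since_chargeoff", "has_phone"]
model = LogisticRegression(max_iter=1000).fit(history[features], history["paid_anything"])

# Hypothetical prospective (partially masked) portfolio to bid on.
prospect = pd.DataFrame({
    "balance": [900, 3100, 700],
    "months_since_chargeoff": [10, 20, 40],
    "has_phone": [1, 1, 0],
})
pay_prob = model.predict_proba(prospect[features])[:, 1]

recovery_rate_if_pay = 0.20   # assumed share of face value collected from payers
target_multiple = 2.5         # assumed required return (collections / price)

expected_collections = (pay_prob * prospect["balance"] * recovery_rate_if_pay).sum()
max_bid = expected_collections / target_multiple
print(f"Expected collections: ${expected_collections:,.0f}; max bid: ${max_bid:,.0f}")
```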

Financial Services
Legal
Professional Services

$100/hr - $175/hr

Starts Aug 01, 2016

21 Proposals Status: CLOSED

Client: S********* *********

Posted: Jul 13, 2016

Hadoop Architect with strong HBase, Storm, Phoenix, Spark experience for a Top Ten E-commerce Retailer

We are one of the top 10 e-commerce retailers in the world and are looking for a Hadoop Architect with strong HBase, Storm, Phoenix, and Spark experience.

Job Description:

1. Assist in an architecture and implementation review of HDP ingesting data at scale (~2-3 TB/day)

Data is ingested from site servers

JVM --> Kafka --> Storm --> HBase / ORC --> Spark SQL

JVM --> Kafka --> Storm --> HBase --> Phoenix SQL

Assist with Spark Streaming and Spark SQL guidance and development tasks as per use case above

Perform an architecture review of the existing HDP solution and provide a written PowerPoint report on findings and recommendations.

2. Assist Customer with HDP configurations to address identified findings, focusing on the following:

a. Kafka

b. Storm

c. HBase / ORC

d. HDP architecture scaling for increased ingested data volume and processing

e. Phoenix / Spark SQL

3. Assist in performing an end-to-end data ingestion flow from the servers’ JVMs through Kafka, Storm, and HBase.

This is a two-week project, involves working remotely and would need to start as soon as possible.
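
For the "HBase / ORC --> Spark SQL" step in the pipeline above, a minimal PySpark sketch of registering the landed ORC files as a Spark SQL view and running an ad-hoc aggregate might look like the following; the HDFS path and column names are illustrative assumptions, not the retailer's actual job.

```python
# Minimal sketch: query ORC event files landed by Storm via Spark SQL.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hdp-orc-sketch")
    .getOrCreate()
)

# Hypothetical HDFS location where Storm lands ORC files.
events = spark.read.orc("hdfs:///data/site_events/orc/")
events.createOrReplaceTempView("site_events")

# Example ad-hoc aggregate over the ingested clickstream (hypothetical columns).
daily = spark.sql("""
    SELECT to_date(event_ts) AS event_date,
           page_type,
           COUNT(*)          AS events
    FROM site_events
    GROUP BY to_date(event_ts), page_type
    ORDER BY event_date
""")
daily.show(truncate=False)
```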

Consumer Goods and Retail
Apache HBase
Apache Hadoop

$125/hr - $250/hr

Starts Jul 18, 2016

12 Proposals Status: CLOSED

Net 30

Client: M***

Posted: Jul 09, 2016

Data Warehouse Architect to diagram and enumerate SHYFT environment (Phase 1)

We are an acquisitive Pharma company (no R&D) and as such we acquire companies that often have disparate data platforms. We are trying to build two sovereign data platforms (one for Commercial Sales Division and Corporate and the other platform for Consumer Sales and Lab). We would also like to create an MDM structure which bridges the two with ETL processes on either side to protect data flow. In the short term, we need to fully understand the architecture and design of our current analytics platform for our Commercial Sales/Corporate world.

Phase 1 requires that we fully understand the architecture and design of our Shyft environment as it relates to commercial analytics, contracts, reimbursement, etc. It was designed and built by prior personnel, and no design docs were issued. The role will fully enumerate, document, and present on the architecture of the Shyft environment, working with the Commercial Analytics team SMEs.

The role will then, working in tandem with the Commercial Analytics team SMEs, present a proposal for a design that meets the current and future needs of the business including, but not limited to: commercial contract reimbursement, reporting, Tableau connectivity, and related functions.

This outcome will allow us to create a new process structure for contract and reimbursement data flow into the Shyft environment to be used to provide a single source of truth for this data.

Shyft
Pricing and Reimbursement Analysis
Managed Markets

$5,000 - $20,000

Starts Jul 05, 2016

9 Proposals Status: CLOSED

Client: A**** **************** ****

Posted: Jun 30, 2016

Sales Prediction Algorithm for an Apparel Retailer

Everester is a mid-sized apparel retailer. They manufacture and sell a variety of apparel for women, kids, and men, with stores mostly in the USA. We have the sales history for their stores for the past 3 years, and we also know the store locations, which helps explain how location might influence sales at these stores. We need to predict the sales per day for each of the next 100 days. We would like a data science program that can also look at adjacent factors that influence sales, including location and weather. An effective sales prediction will allow the store to decide when to advertise more versus less, and will also help them plan inventory and staffing levels at each store.

We want you to create a general-purpose data science module in Python or R that can:

- Digest the daily sales data for the past 3 years and develop a model.

- Identify and use any publicly available sources of data for influencers on sale.

- Produce a prediction model that is cheap to run and predicts sales with high accuracy (within a 3% tolerance).

- Use the model to produce predicted sales at the stores for the next 100 days.

- The model should run with minimal deployment steps by an engineer at Transform. 

- Stretch area: Recommend which days are good for running promotions and which are not, to attract more customers.

As part of the deliverable, we require a 2-4 page writeup of your findings in addition to the code module.

Data Set: The data set and problem are proprietary and confidential. Please do not share this with others.

The data will be shared once we get through the initial problem acceptance phase. We will start with data for 3 stores: about 2,500 rows, provided as a CSV file of roughly 200 KB. We then expect to have Transform run the model for additional stores to validate it.

Here is a sample data file showing the format:

# Daily Sales Data

# Store: TB1 = Tacoma, WA; PO1 = Portland, OR; DE1 = Denver, CO

Date,Store,RevenueUSD
4/24/16,TB1,40183
4/24/16,DE1,18849
4/24/16,PO1,33413
4/23/16,TB1,73952
4/23/16,PO1,43945
4/23/16,DE1,63149
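
As a hedged starting point (not the required deliverable), the sketch below loads a CSV in the format above and fits a simple per-store trend plus day-of-week model to project the next 100 days. The file name is hypothetical, and weather/location factors and the accuracy target are out of scope here.

```python
# Minimal sketch: per-store daily sales forecast from the CSV format above.
import pandas as pd
from sklearn.linear_model import LinearRegression

sales = pd.read_csv("daily_sales.csv", comment="#", skipinitialspace=True)
sales["Date"] = pd.to_datetime(sales["Date"], format="%m/%d/%y")

def forecast_store(df: pd.DataFrame, horizon: int = 100) -> pd.DataFrame:
    """Fit RevenueUSD ~ linear trend + day-of-week dummies for one store."""
    df = df.sort_values("Date")
    t0 = df["Date"].min()
    X = pd.concat(
        [pd.DataFrame({"t": (df["Date"] - t0).dt.days}),
         pd.get_dummies(df["Date"].dt.dayofweek, prefix="dow")],
        axis=1,
    )
    model = LinearRegression().fit(X, df["RevenueUSD"])

    future = pd.date_range(df["Date"].max() + pd.Timedelta(days=1), periods=horizon)
    Xf = pd.concat(
        [pd.DataFrame({"t": (future - t0).days}),
         pd.get_dummies(future.dayofweek, prefix="dow")],
        axis=1,
    ).reindex(columns=X.columns, fill_value=0)  # align dummy columns with training
    return pd.DataFrame({"Date": future, "PredictedRevenueUSD": model.predict(Xf)})

for store, history in sales.groupby("Store"):
    print(store)
    print(forecast_store(history).head())
```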

Consumer Goods and Retail
Machine Learning
Predictive Modeling

$75/hr - $150/hr

Starts Jul 06, 2016

15 Proposals Status: COMPLETED

Client: T*********

Posted: Jun 28, 2016

Looking for TAM Attributes for Various Categories Regarding Analytics and Big Data

Jumpshot is a marketing analytics company that helps marketers understand their customers’ entire online lives. From the key sources of traffic to a site, to the browsing, consuming, and buying behavior on a site, to where customers go once they’ve left a site, our platform reveals the entire customer journey. Jumpshot tracks more than 160 billion monthly clicks from our 100-million-customer panel's clickstream activity. In short, we are able to see every single click that our user panel performs, in the order they perform them, from January 2014 through yesterday.

I am trying to ascertain the global Total Addressable Market (TAM) in numerous verticals to get a better understanding of the opportunity for my company to expand moving forward. The verticals of interest, along with their anticipated future spend on analytics and big data, are listed below.

We are trying to understand the TAM for our company so we can better project the opportunity and size of our business, including choosing the proper specific verticals and industries to target.

Categories

- Marketing analytics (meaning brand direct spend in verticals such as Travel, eCommerce, and Retail)

- Ad Tech (platforms, DMP, DSP, Customer Segmentation, etc)

- Institutional investor (hedge funds, banks, investment firms, etc)

- Market research

- Pharmaceutical/Biotech analytics

- Healthcare analytics

- Consumer Package Goods

- Media/Publishing

- Entertainment (movie, TV, streaming music/video)

 

Category attributes

- Total size

- Growth rate projections for next 5 years

- Competitive set

- Geographic focus where applicable (US, UK, and International)

- Top 5 Vendors in each category/vertical selling data into the analytics space

The kind of expertise you require:

A market research expert who can assess TAM for multiple industries, including various specific attributes such as total size, growth rate projections 5 years out, competitive set, and top geographic areas.

What is the deliverable?

Word doc, Google doc, or PowerPoint slides

 

Consumer Goods and Retail
Financial Services
Media and Advertising

$2,500 - $4,500

Starts Jul 12, 2016

4 Proposals Status: COMPLETED

Client: J********

Posted: Jun 23, 2016
