

Correlation Dynamics

  • Our business is optimal trade execution in US cash equities. We wish to improve models of factor dynamics and correlation dynamics relevant in risk and fair price models.
  • Background: stock returns are commonly modeled as linear regressions on factor returns; the coefficients reflect the correlation structure over the training dataset. Unfortunately the correlation structure changes: what is useful is not the past correlation structure but the future one. ADCC models use a GARCH-like approach to model correlation dynamics, but we are concerned they may fail to adequately capture the context-dependency of the correlation structure.
  • The basic idea here is to develop an alternative way to model factor dynamics that doesn't use daily returns but instead models the market as a poset (partially ordered set) of "events". Each event is a significant change in one of the factors over a period of time from t1 to t2. We provide a simple algorithm to identify events, and we provide a dataset of daily prices and volumes.
  • The first stage of the project involves the following steps. This is a very deep problem; if the first stage is successful, more will follow.
  • (1) compute daily returns (close to close) using the data provided
  • (2) Build the daily returns correlation matrix of 57 factors (see attachment) --> report #1
  • (3) Build Lasso regressions to explain daily returns of each factor as a function of a subset of the others. Here the choice of Lasso is intended to provide natural feature selection; other feature selection methods may be suggested by the researcher.
  • (4) Use the provided algorithm to construct the events history for each factor. Each event comprises an event type ("rise", "fall" or "equilibrium"), a start and end time, and 57 factor returns.
  • (5) Build the correlation matrix of event factor returns, and test it against the null hypothesis that it is the same as the daily returns correlation matrix --> report #2
  • (6) Build Lasso regressions for event factor returns, use these to produce estimates and compute the R^2. Then compute the R^2 using the daily returns regressions saved from point (3) above --> report #3
  • (7) Construct a chronology of the market activity through a sequence of non-overlapping "leading events". A leading event is an event for which the % unexpected factor return is greater than for all other events with overlapping start/end times. A simple algorithm will be provided to derive this from the events history --> report #4 
  • The utility of step 7 will become apparent when we state the next phase of the project.
  • If successful, the next stage of the project will address conditional correlations. Stage 3 will establish whether the event representation enables better forward correlation predictions than an ADCC model.
  • The data consists of daily price and volume data from 2008 to Oct. 2013. Subsequent dates are reserved for out-of-sample (OOS) testing.
  • We mainly use R and Mathematica here.
  • The deliverables will be the reports mentioned above, the enriched data tables produced, and the corresponding code.
  • Data shared for the duration of the project may not be used for any other purpose and must be destroyed upon termination. The project itself will remain confidential unless otherwise stated; the reports and all documentation related to the project are considered confidential and must be destroyed upon termination.
  • Sample data, an Excel example of the events history logic and the data spec are attached below
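The pipeline in steps (1)-(7) can be sketched compactly. The client works mainly in R and Mathematica, so the Python below is an illustration only; in particular, the event-segmentation threshold and the leading-event scoring are hypothetical stand-ins for the algorithms the client will provide.

```python
import numpy as np

def daily_returns(closes):
    """Step (1): close-to-close simple returns; closes has shape (days, factors)."""
    closes = np.asarray(closes, dtype=float)
    return closes[1:] / closes[:-1] - 1.0

def correlation_matrix(returns):
    """Step (2): Pearson correlation matrix across factor columns."""
    return np.corrcoef(returns, rowvar=False)

def label_events(series, threshold=0.02):
    """Step (4), toy version: contiguous runs whose cumulative return
    crosses +/- threshold become "rise"/"fall" events; the leftover tail
    is "equilibrium".  The client's provided algorithm replaces this."""
    events, start, cum = [], 0, 0.0
    for t, r in enumerate(series):
        cum += r
        if abs(cum) >= threshold:
            events.append(("rise" if cum > 0 else "fall", start, t + 1))
            start, cum = t + 1, 0.0
    if start < len(series):
        events.append(("equilibrium", start, len(series)))
    return events

def leading_events(scored):
    """Step (7), greedy version: among overlapping events, keep the one
    with the largest unexpected-return score.  `scored` is a list of
    (score, start, end) tuples with half-open [start, end) windows."""
    chosen = []
    for ev in sorted(scored, key=lambda e: -e[0]):
        if all(ev[2] <= c[1] or ev[1] >= c[2] for c in chosen):
            chosen.append(ev)
    return sorted(chosen, key=lambda e: e[1])
```

The greedy pass in `leading_events` mirrors the definition above: an event survives only if no higher-scoring event overlaps its window, yielding a non-overlapping chronology.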
Financial Services
Economic Modeling
Price Level Indications

$2,500 - $5,000

Starts Oct 28, 2015

2 Proposals Status: COMPLETED

Client: P******** ***

Posted: Oct 09, 2015

Data Scientist

JetSmarter is evolving the way people fly. By seamlessly connecting flyers to aircraft through our apps, we make private aircraft more accessible, opening up more possibilities for flyers and more business for air carriers. We're a game-changing startup, providing quality, on-demand air charter from the touch of your phone. Tenacity in delivering an awesome experience for our users is critical to our success, and we're looking for a Data Scientist to help bolster this. Due to the company's rapid expansion, there is much opportunity for growth within the organization. This candidate will be integral in improving logistical efficiency, streamlining processes, and creating unique methods to increase conversions by analyzing historical, current, and projected data.

Aviation began as a private luxury; since then, the industry has become burdened by layovers, lines, and luggage fees. We are reinventing the way you travel.

Qualifications:

  • 2+ years of experience as a data scientist.
  • Experience with production code, visualizing data, and machine learning desirable.
  • Doctorate degree desirable.
  • Master's in Mathematics, Computer Science, or a related field applicable.
  • High attention to detail.
  • Working knowledge of (but not limited to): natural language processing, conceptual modeling, predictive modeling, and hypothesis testing.
  • Excellent communication skills.

Responsibilities:

  • Bring not just an analytics-orientation, but the ability to use analytics to drive key success metrics related to cost reduction and revenue generation.
  • Work with others to develop, refine and scale data management and analytics procedures, systems, workflows, best practices and other issues.
  • Demonstrate analytical and quantitative skills, with the ability to build innovative, complex models that illustrate various scenarios of logistical flows and compare KPIs.
  • To be successful in this role, you should be comfortable executing with little oversight and be able to adapt to problems quickly.
  • Experience with statistical analysis, multivariate testing strategies and concepts, regression modeling and forecasting, time series analysis, data mining, financial analysis, economic order quantity modeling, game theory, and customer/product segmentation.
  • Building prototype optimization/statistical models for improving business, product, and engineering operations.
Aerospace
Hi-Tech
Hospitality, Travel and Leisure

$100/hr - $200/hr

Starts Nov 01, 2015

20 Proposals Status: CLOSED

Client: J**********

Posted: Oct 06, 2015

Predicting the Best Customers and Lead Scoring for a Mailing Optimization Company

We are a mailing optimization company that helps large brands analyze their opportunities to mail more efficiently and at a lower cost. The purpose of this project is to determine the candidates most likely to buy, and to focus our sales and marketing investment and efforts on the most profitable prospects and customers. The objective is to create a scalable reporting and analytics system that enables us to produce daily reports. We want access to the code so that we can update it as we get new data sources.

Analytics and Dashboard

This phase involves building a lead-scoring algorithm that predicts our best customers. This will include analytics on thousands of contacts in SQL and CRM data, considering factors such as pre-buying activities (white paper downloads, e-mail open rate and frequency, webinar sign-ups and attendance, interactions with a salesperson, demo attendance), company name and attributes, zip code, LinkedIn profile data, mail volume, employee count, prior sales data, and related company sales data. We want the dashboard to tell us the top 50 prospects our salespeople should be focusing their energy on at any given moment.
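A minimal sketch of the scoring and ranking described above (the feature names and weights here are assumptions standing in for the CRM signals, not the client's model):

```python
# Hypothetical engagement signals and weights; the real model would be
# fitted on the SQL/CRM data described above.
WEIGHTS = {
    "whitepaper_downloads": 3.0,
    "email_open_rate":      2.0,   # fraction in 0..1
    "webinar_attendances":  4.0,
    "demo_attended":        6.0,   # 0 or 1
    "sales_interactions":   1.5,
}

def score(prospect):
    """Weighted sum of engagement signals for one prospect dict;
    missing signals count as zero."""
    return sum(w * prospect.get(f, 0) for f, w in WEIGHTS.items())

def top_prospects(prospects, n=50):
    """Rank all prospects and return the n highest-scoring ones for
    the daily dashboard."""
    return sorted(prospects, key=score, reverse=True)[:n]
```

In production the hand-set weights would be replaced by a fitted classifier, with the same rank-then-truncate step feeding the dashboard's top-50 list.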

Sales
Dashboards
Reporting

$12,000 - $15,000

Starts Nov 10, 2015

19 Proposals Status: COMPLETED

Client: W****** ***** ***

Posted: Sep 30, 2015

Tableau Architect / Developer

This is posted on behalf of the client. 

Tableau Architect/Developer

Location: Princeton, NJ

Client: Pharma

Positions: 1

Duration: 6 months total. The first three weeks are on-site. The remaining time will require 1 day a month on-site.

Travel Expenses will be paid for on-site work.  

Please apply only if you can work on-site for the first three weeks of the project.

 

Job Description

 

  • Work with Tableau, creating visuals, building dashboards and customizing to needs of business
  • Use custom SQL for complex data pulls
  • Drive insight by designing visualizations with logical and meaningful data flow
  • Be analytical, with excellent problem-solving skills and an eye for patterns
  • Meet with key stakeholders and develop design requirements based on discussion
  • Push the limits of software, be a problem solver
  • Start from scratch with raw data set and build key insightful visualizations without direction from client or stakeholders

 

Qualifications

 

  • 5+ years working with Tableau, creating visuals, building dashboards, and customization to meet needs of end-user
  • Tableau 8.0 and 8.1 experience
  • Experience designing complex dashboards that take advantage of all Tableau functions, including data blending, actions, parameters, etc.
  • Experience with connecting to multiple data sources including Teradata, SAP BW, Oracle, SQL, Hadoop and others
  • Experience integrating Tableau into various external environments (such as websites, sharepoint, etc.)
  • Familiarity with Tableau’s JavaScript API
  • Good knowledge of Tableau Server, administrative functions, and installation (a plus)

Pharmaceutical and Life Sciences
Tableau
Business Intelligence and Visualization

$80/hr - $100/hr

10 Proposals Status: CLOSED

Client: E*******

Posted: Sep 30, 2015

Predictive Analysis for Chemical Delivery Frequencies and Demand

We are a chemical distribution company focusing on the aquatic industry across the Gulf Coast, and we are looking to use our data to predict optimal delivery dates for our product. 55% of our sales come from customers that pay us a flat rate; in return, we provide whatever quantity of chemicals is necessary. We are therefore in a position to determine the best time to deliver (we aim to deliver when 80% of the product has been consumed).

Our customer groups include water parks, apartments, hotels, water treatment plants etc. Currently we have ~5500 active delivery points and would like to potentially use the historical delivery data, customer information, and outside factors (ex: weather) to predict when the next time a customer will need a delivery.  We have up to 4 years delivery data in our database (NetSuite) that can be exported to any type of file or integrated in many ways.

Currently we use a very basic R model that looks only at historical data and is not very accurate, so it is mostly used to flag accounts that need further review. We use cloud applications across our organization, so the ideal solution should follow this strategy.

Examples of available data: delivery amount and date, pool volume, account type (apartment, hotel, etc.), location, and apartment-specific information (including occupancy and number of units).
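The 80%-consumed rule can be illustrated with a simple rate-based baseline (the function and its inputs are assumptions for illustration; the real model would fold in weather, occupancy, and account type):

```python
from datetime import date, timedelta

def predict_next_delivery(deliveries, tank_volume, trigger=0.80):
    """Baseline sketch: estimate a site's average daily consumption from
    its delivery history, then project the date at which `trigger` (80%)
    of the tank volume will have been consumed.

    deliveries: list of (date, amount) pairs sorted by date, at least two
    entries; each delivery is assumed to top the tank back up, so the
    amounts after the initial fill approximate what was consumed."""
    first_day, _ = deliveries[0]
    last_day, _ = deliveries[-1]
    consumed = sum(amount for _, amount in deliveries[1:])
    rate = consumed / (last_day - first_day).days   # units per day
    days_until = trigger * tank_volume / rate
    return last_day + timedelta(days=round(days_until))
```

For a site refilled with 100 units every 10 days and a 125-unit tank, this predicts the next delivery 10 days after the last one, i.e. when 100 units (80% of the tank) have been consumed.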

In your proposal, please provide the total number of hours you will require to complete the project.

Thanks!

Chemical, Oil and Gas
Transportation and Warehousing
Inventory Management

$65/hr - $120/hr

Starts Nov 09, 2015

28 Proposals Status: COMPLETED

Client: P********

Posted: Sep 25, 2015

Big Data Architecture of Realtime System (Storm)

We are a proximity marketing company. We have a network of beacons installed in hundreds of locations, including malls and restaurants. Our beacons currently see over 25 million users per month. We need help in designing our big data architecture, and possibly in implementing it as well.

Overview

In order to offer a realtime solution for the events data generated by the Mobile API component, we need to define an architecture that delivers information about the user devices interacting with our system. In this process we'll receive events from devices, and we need to enrich these events with more information.

The topology is composed of the following components:

  • Kafka spout: This component receives data (events) from the backend component via a kafka queue and emits them to the rest of the topology.
  • Validation bolt: It receives events from the Kafka spout and validates whether the event is compliant with the defined JSON schema. If compliant, it emits the event to the rest of the topology and inserts it into the events MongoDB collection. If not compliant, it does not emit the event and stores it in the errors MongoDB collection.
  • IDFA Bolt: Given an event emitted from the Validation bolt, with its corresponding IDFA, this bolt stores info associated with this IDFA in the CIIM MongoDB collection.
  • S3 Bolt: It receives events from the Validation bolt, converts them into tuples (using some fields of the JSON document) and stores them into S3 to be processed later by EMR.

This Storm topology has been implemented using Python and StreamParse. More information about the project is attached.
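The Validation bolt's routing logic can be sketched independently of Storm/StreamParse (the event schema below is an assumption, and Python lists stand in for the MongoDB events and errors collections):

```python
# Hypothetical event schema: required field names and types are
# assumptions, not the client's actual JSON schema.
REQUIRED = {"idfa": str, "beacon_id": str, "timestamp": int}

def validate_event(event):
    """Return (True, event) if the event is schema-compliant, else
    (False, error_record) describing the missing or mistyped fields."""
    bad = [k for k, t in REQUIRED.items()
           if k not in event or not isinstance(event.get(k), t)]
    if bad:
        return False, {"event": event, "missing_or_bad": bad}
    return True, event

def route(event, events_coll, errors_coll):
    """Mimic the Validation bolt: compliant events go to the events
    collection (and would be emitted downstream); non-compliant events
    go to the errors collection and are not emitted."""
    ok, payload = validate_event(event)
    (events_coll if ok else errors_coll).append(payload)
    return ok
```

In the real topology, `route` returning True corresponds to emitting the tuple onward to the IDFA and S3 bolts, while the two append targets correspond to the events and errors MongoDB collections.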

Sources of Data

  1. Data coming from mobile devices
  2. Data on beacon locations
  3. Campaign data
  4. API reporting usage data against which we are billing

We currently have two MongoDB instances in production, interacting with Storm, S3 and Hadoop.

We are looking for an architect to critique our current plans and help us build a highly scalable system. We are looking for short-term fixes to the current system, as well as a long-term architecture that will enable us to scale as we increase the number of beacons we have deployed.

In your proposal, please provide 1) previous work that you have done that is relevant; 2) how you would approach this architecture exercise; and 3) estimated hours and budget.

Hi-Tech
Media and Advertising
Data Center

$100/hr - $200/hr

Starts Sep 21, 2015

14 Proposals Status: IN PROGRESS

Net 30

Client: M********* ******** ***

Posted: Sep 16, 2015

Financial Services Ontologist

We are working with a leading wealth management firm to enable an enterprise search capability that will change the way financial advisors, other personnel, and clients interact with data throughout the organization. Building an ontology is a key enabler of this strategy. We are looking for an individual who has developed and implemented ontologies in a financial services context.

Financial Services
Self Servicing Semantic Data Layer
Data Management

$80/hr - $125/hr

Starts Sep 28, 2015

11 Proposals Status: CLOSED

Client: K***********

Posted: Sep 14, 2015

Senior Marketing Analytics Specialist

We are a premier full-service agency that blends food, culture and marketing expertise to deliver powerful integrated ideas that feed our clients' success.  In this part-time role, we are looking for a passionate analyst to turn data into information, information into insight and insight into business decisions. The Analyst position is dedicated to driving meaningful and actionable analysis on integrated advertising, social media and CRM campaigns with a strong focus on agency tools and resources to execute effectively. This position will work with the media and account service team to develop all analytical deliverables for assigned clients, including back-end reporting, testing frameworks, test results read-outs and ongoing client learning plans. Additionally, the Analyst works with the provided resources to create customer/business insights that facilitate the development of a marketing strategy designed to positively impact our client’s business.

Responsibilities:

  • Responsible for day-to-day analytics deliverables for assigned clients, including QA of data and any scheduled reporting
  • Develops close working relationship with Client Services teams to provide program reporting, campaign recommendations, as well as thought leadership that is in alignment with the client’s business goals and strategy
  • Analyzes campaign and customer data to provide conclusions, implications and actionable recommendations designed to improve the quality of future marketing communications
  • Demonstrates knowledge and proficiency in the utilization of analytical tools for advertising, social media, CPG marketing, general and digital media
  • Identify, analyze, and interpret trends or patterns in complex data sets
  • Locate and define new process improvement opportunities
  • Acquire data from primary or secondary data sources and maintain databases/data systems
  • Identify & analyze custom data for segmentation & profiling
  • Develop and maintain results dashboards.

Qualifications:

  • BA/BS required.
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • 3-5 years’ experience in analytics focusing on marketing campaign effectiveness reporting, analytics, and development of customer knowledge in a fast-paced client environment
  • Experience working in marketing or media analytics preferred
  • Working knowledge of basic analytic tools, with advanced experience in data visualization tools such as Tableau.
  • Working knowledge of the utilization of analytical techniques for the measurement of online advertising, CRM and web environments a plus
  • Demonstrates ability to multi-task in a fast-paced environment
  • Possesses exceptional written and verbal communication skills
  • Motivated, self-starter
  • CPG marketing & food experience a plus
  • Familiarity with data providers such as Nielsen, IRI, Ad views, MRI, etc.
  • Familiarity with a breadth of ad technology a plus: Ad Servers (DoubleClick, Atlas, MediaMind, etc.), Search Engine Optimizers (Kenshoo, Marin, etc.), Web Analytics (Omniture, Google Analytics, etc.), Demand Side Platforms (Turn, MediaMath, etc.), Data Management Platforms (Turn, BlueKai, etc.), and Attribution (ClearSaleing, Adometry, etc.) Social media listening tools such as (Sendible, Crimson Hexagon) etc.
Professional Services
Customer Analytics
Media and Advertising

$75/hr - $125/hr

Starts Oct 15, 2015

14 Proposals Status: COMPLETED

Client: T*** **** *****

Posted: Sep 14, 2015

Design and build a database and data input web application for a private investment firm

Design and build a database schema and an associated web-based data entry application to support investment tracking. We are a private investment firm that invests in early-stage companies.

At a high level, our private investment firm pools money from investors known as limited partners into vehicles or funds. These funds then invest a certain amount of capital in investment rounds of companies under various terms. Other investors may also invest alongside us in the investment round or at other times. The companies submit financial and other operating metrics to the funds from time to time.

The database we're trying to build needs to house all the data related to our investors, funds, companies, investments, and company performance.

High level requirements are:

  • Design a database schema and physical model based on the attached conceptual model. (The list of attributes required for each entity will be provided.)
  • Create and deploy the database in MySQL on AWS or other cloud infrastructure.
  • Design, build and deploy (on cloud) a web-based UI for users to input the data into the database. The UI needs to have simple data input, edit and delete functionality.
  • Data visualizations are not in scope for this project, although the database design should be in such a way that Tableau can pull the data. To that end, denormalized views may need to be created depending on the physical model.
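The schema plus denormalized-view requirement can be sketched as follows, using SQLite for a self-contained illustration (the target is MySQL, and every table and column name here is an assumption; the real attribute list comes from the client's conceptual model):

```python
import sqlite3

# Minimal illustrative schema for the entities named in the posting.
DDL = """
CREATE TABLE fund (fund_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE company (company_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE investment (
    investment_id INTEGER PRIMARY KEY,
    fund_id INTEGER REFERENCES fund(fund_id),
    company_id INTEGER REFERENCES company(company_id),
    round TEXT, amount REAL, invested_on TEXT);
-- Denormalized view so a BI tool like Tableau can pull one flat table.
CREATE VIEW v_investments AS
  SELECT f.name AS fund, c.name AS company, i.round, i.amount, i.invested_on
  FROM investment i JOIN fund f USING (fund_id)
                    JOIN company c USING (company_id);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
conn.execute("INSERT INTO fund VALUES (1, 'Fund I')")
conn.execute("INSERT INTO company VALUES (1, 'Acme Inc')")
conn.execute("INSERT INTO investment VALUES (1, 1, 1, 'Seed', 250000.0, '2015-09-01')")
rows = conn.execute("SELECT fund, company, amount FROM v_investments").fetchall()
```

The normalized tables keep data entry clean for the web UI, while the view provides the flat, joined shape a visualization tool expects without duplicating data.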

In your proposal, please be very specific about 1) your approach; 2) the technology stack that you would use; 3) milestones and dates; 4) Samples of previous related work you have completed.

We will accept proposals only from the United States, with preference given to those on the East Coast.

Financial Services
UI Design
Web Programming

$20,000 - $30,000

Starts Sep 21, 2015

10 Proposals Status: CLOSED

Client: P******* ********** ****

Posted: Sep 11, 2015

Matching Algorithm for Executive Mastermind, Iteration 2

We seek to build the second iteration of the algorithm mentioned here: https://www.experfy.com/projects/50-matching-algorithm-for-high-profile-executive-mastermind-group

This algorithm will update and improve upon last year's algorithm by adding new member data, adding newly collected data for members (by helping us survey and collect relevant information from members), and clustering the Mastermind group into groups of 8 people.
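A naive baseline for the groups-of-8 clustering might look like the sketch below (the single numeric `score` per member is an assumption for illustration; the real matching would use the multi-dimensional survey data):

```python
def chunk_into_groups(members, group_size=8):
    """Order members by a similarity-preserving key (here, one numeric
    profile score per member -- an assumption) and chunk them into
    consecutive groups of `group_size`, so similar scores land together.
    The trailing group may be smaller when the roster isn't a multiple
    of group_size."""
    ordered = sorted(members, key=lambda m: m["score"])
    return [ordered[i:i + group_size]
            for i in range(0, len(ordered), group_size)]
```

A production version would replace the one-dimensional sort with a constrained clustering over the survey features, but the chunking step that enforces the fixed group size stays the same.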

Data Visualization
Data Analysis and AI Tools
Hi-Tech

$2,500

Starts Sep 03, 2015

1 Proposal Status: COMPLETED

Client: P*** ********

Posted: Sep 03, 2015
