
Algorithm for Generating Manufacturing Times from CAD Models

The goal of this project is to use a data set of 3D CAD models (input) correlated with known manufacturing times for milling (output) in order to generate an algorithm (or algorithms) that predicts milling manufacturing times from 3D CAD models. This is an exploratory project to determine whether the inputs currently available can provide algorithmic outputs that are accurate within +/- 5% of known outputs.

We are looking for out-of-the-box thinking to develop these algorithms. We are dealing with 3D objects, and there may be unconventional approaches that could help us accurately predict the Run Time so that it can be used to balance the Capacity Loading on our shop floor. This Run Time would serve as a workload trigger to accurately manage committed lead times.

Data Inputs

Option A: 3D CAD Model: available in a variety of formats (see Appendix A)

Option B: Option A plus Variables (see Appendix B) readily extracted from the 3D CAD model (i.e., XYZ extents, volume, area)

Option C: Option B plus Variables (see Appendix C) extracted from the 3D CAD model using Rapid-developed proprietary tools

Data Output (Results)

Provided in Data Sample:

1. Total Run Time

  • Operation 1 Run Time
  • Operation 2 Run Time

2. Desired Data Output

  • Total Run Time per Part

3. Even More Useful Data Output

  • Separate Run Times for Operation 1 and Operation 2

4. All parts provided will only have two Operations

Our Current Software & Platforms

  • C# is our main Code Platform with Visual Studio 2013
  • SQL DataBase
  • WCF Services
  • MVC5 Web Sites
  • AWS

Output Format

  • If possible, we would like an R .dll with a C# wrapper that we can use in our code for each process run (if this is not in your skill-set, we can have someone else do this--your job will be to provide R code).
  • Total Time for each part
  • Extra credit: Time for Operation 1 and Operation 2 for each part
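
One possible feasibility baseline is a regression model on the Option B variables. The sketch below is in Python purely to illustrate the approach (the deliverable itself would be R code per the Output Format above); the file and column names are hypothetical and would need to match the attached Excel file.

```python
# Sketch only: baseline regression from simple geometric features to Total Run Time.
# File and column names are hypothetical placeholders for the attached data.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

df = pd.read_excel("part_run_times.xlsx")                              # hypothetical file
features = ["extent_x", "extent_y", "extent_z", "volume", "area"]      # hypothetical columns
X, y = df[features], df["total_run_time"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

# Fraction of held-out parts predicted within +/- 5% of the known run time
pred = model.predict(X_test)
within_5pct = (abs(pred - y_test) / y_test <= 0.05).mean()
print(f"Parts within +/-5%: {within_5pct:.1%}")
```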

Questions

  • Is the project feasible?
  • Do the Data Inputs provide enough information?
  • Is the desired Output reasonable? What sort of accuracy range can be expected?

Appendices (see attached PowerPoint file)

  • CAD Model Formats
  • Variables 3D Model
  • Variables Company

NB: An Excel file with part run times and definitions is attached, along with a zip file containing 3D models linked to the "Part URL" column in the Excel file.

Please clearly specify the following in your proposal:

  1. Your understanding of the problem and your questions, if any.
  2. How would you tackle this problem--what kind of techniques would be effective and why?
  3. How much time would it take to build a proof of concept, and how many parts would you include in it?
  4. What is your budget for the proof of concept?

Manufacturing
Computer Vision
Image Analysis

$75/hr - $175/hr

8 Proposals Status: CLOSED

Client: R*****

Posted: Aug 11, 2015

Optimal Sales Call Time(s)

We're an entrepreneurial coaching service targeted at successful business owners looking to increase their profitability and effectiveness while minimizing the amount of time needed to grow their company.

We are trying to determine what the best time of day is for our sales team to call prospects. We need the results broken down by time zone.

Our current database resides in Salesforce; we can provide a CSV format export if required.
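
For reference, one simple way to approach this with a call-activity export is to compare connect (or conversion) rates by hour of day within each time zone. A minimal sketch, assuming hypothetical column names in the CSV:

```python
# Sketch only: best calling hour per time zone by connect rate.
# Column names ("time_zone", "call_time", "connected") are hypothetical.
import pandas as pd

calls = pd.read_csv("salesforce_calls.csv", parse_dates=["call_time"])
calls["hour"] = calls["call_time"].dt.hour

rates = (calls.groupby(["time_zone", "hour"])["connected"]
              .agg(["mean", "count"])
              .query("count >= 30"))            # skip thinly sampled hours
best = rates["mean"].groupby(level="time_zone").idxmax()
print(best)                                     # (time_zone, hour) with the highest connect rate
```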

Sales
Data Profiling
Data Management

$75/hr - $150/hr

Starts Sep 16, 2015

20 Proposals Status: COMPLETED

Client: S********* ***** ****

Posted: Aug 11, 2015

Product Sampling - Data/Measurement/CRM/Predictive Analytics

We are an agency holding company consisting of six subsidiaries, each with their own areas of expertise. For this project, we will be focusing on our Lifestyle Sampling & Fulfillment subsidiary.

This subsidiary distributes a high volume of Consumer Packaged Goods (CPG) Product Samples to a variety of venues (hotels, spas, colleges, events, trade shows, etc.), and also handles warehousing, storage, and overwrapping of the CPG Products prior to distribution. 

Background/Problem: There is currently no good measurement tool in existence that measures Product Sampling Effectiveness.

The marketing mix models (MMMs) were built for marketers to input the specific details of their marketing spend to determine Return On Investment (ROI) at the market and tactic level. These models do not include Product Sampling in the mix due to lack of volume (a tactic has to reach 1% of the population in order to be measurable within the models). These models were built many years ago to account for traditional media, including TV advertising, radio, print, out-of-home media, direct mail, in-store media, social/digital media, etc. They were commissioned by advertising agencies that do not traditionally control the product sampling spend, and therefore self-servingly omitted measuring tactics, like product sampling, that do not drive revenue into their agency. As a result, marketers that had historically spent billions of dollars on Product Sampling every year have begun to greatly reduce or altogether abandon sampling as a tactic, because they are unable to quantify sampling's short- or long-term impact.

Our previous market research strategy has utilized an attitudinal research methodology, whereby we would interview consumers after they received a sample in order to determine their previous purchase habits and their future purchase intent, using a test/control methodology to determine incrementality. The limitations of this methodology (and other similar methodologies) are many, including the inability to correlate actual purchase with claimed purchase, and the lack of any direct link to in-store sales purchase verification, causing many marketers to believe that it is impossible to measure out-of-store sampling effectiveness. We would like to solve this problem. We believe doing so will not only help us demonstrate the power of this tactic and ultimately result in more manufacturers embracing product sampling, and specifically OUR product sampling solutions, but will also allow us to offer this measurement solution to manufacturers as a SaaS platform to evaluate their proprietary sampling programs, and license it to other sampling vendors/suppliers to empower them to demonstrate the effectiveness of their sampling vehicles.

As some added information, below are a few of the other current (rudimentary) methods being used to measure sampling effectiveness: 

1)    Provide a campaign-specific coupon along with the samples, which has a unique barcode and can therefore be tracked and measured. The main problems with this tactic are that: (a) it does not speak to retention; and (b) many times people will not clip, save, and bring the coupon with them to the store to make the purchase. The goal of a sampling program is to induce trial that leads to purchase conversion. Evaluating sampling effectiveness only among people who are both inclined to use coupons and who follow through and redeem the coupon at purchase excludes all purchases that occurred without a coupon and negatively impacts perceived ROI.

2)    Determine a baseline of sales prior to sampling and then measure sales a few weeks/months post sampling to determine the increase. The main problem with this tactic is that there are so many other variables that could be affecting the baseline that it is hard to attribute any increase directly to sampling. Sample distribution can often occur over a wide geographic area, and sometimes over a long period of time (weeks or months). In addition, purchase cycles impact the timing of purchases (if you just bought a bottle of shampoo, it could be 2 months before you buy another one), and purchases can happen across hundreds of retailers, making it impossible to see any meaningful lift at retail.
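
One way to address the confounds described above is a test/control comparison of pre/post sales changes (a difference-in-differences estimate of incremental lift), in the spirit of the incrementality approach already used in the attitudinal research. A minimal sketch, assuming store- or panelist-level sales data with hypothetical column names:

```python
# Sketch only: difference-in-differences estimate of incremental sales lift.
# Hypothetical columns: "group" ("test" = sampled, "control" = matched, not sampled),
# "period" ("pre" or "post"), and "sales".
import pandas as pd

sales = pd.read_csv("sampling_panel.csv")
means = sales.groupby(["group", "period"])["sales"].mean()

lift = ((means["test", "post"] - means["test", "pre"])
        - (means["control", "post"] - means["control", "pre"]))
print(f"Incremental sales per unit (diff-in-diff): {lift:.2f}")
```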

Project: Create a model that will allow manufacturers, retailers and marketing services suppliers to accurately measure, post-program and/or predict pre-program, the effectiveness of product samples distributed via a variety of different methods, including through venues, via direct mail, via on-line request, and in-store. 

This will allow specific campaigns to be measured, but also ultimately power a predictive model companies will be able to use in order to plan their sampling campaigns based on projected ROI. Cross-tab segmentation might evaluate effectiveness of subgroups based on things like demographics, geographics and psychographics.  We need to be sure to attempt to predict both: (a) increase in sales in the short term; and (b) the long-term impact of those sales. The probability of retention (multiple purchases and lifetime value of the conversion) is an important factor.

In addition to gathering quantities of market research findings and developing a measurement tool, we desire the ability to re-engage participating consumers via opt-in marketing campaigns for ongoing CRM. The database will be populated in two ways: 1) product sampling recipients who participate (both those who do and don’t participate in the research) might be incentivized to participate and/or opt-in by offering them access to future sampling campaigns and/or sweepstakes opportunities; and 2) call to action messages prompting consumers to sign-up will be delivered with the +75 million product samples BC distributes on behalf of our clients annually. The long-term vision would be to have a database whereby we can: 1) conduct research studies and gather insights; 2) execute highly targeted campaigns for marketers; and 3) own a consumer-facing website/app where people register to receive samples and special offers. 

Expertise Required: Market Research best practices/Data Capture/CRM/Predictive Modeling

Data Sources At Our Disposal: We will be able to provide what product we are sampling, what venues the sampling is being distributed in, and the quantity of the samples. Other than that, we will need to gather any other information we deem necessary to complete our goals as the technology is rolled out.

Deliverable: We want to create and own a platform with Data Capture Technology capabilities, with the end goal of capturing enough data to develop an algorithm into which we can plug (a) product, (b) spend, and (c) network in order to predict the short-term sales increase as well as the long-term retention of such consumers. Lastly, the technology will also have to be able to function as a CRM solution.

Please provide the number of hours required to complete the project in your proposal.

Media and Advertising
Market Research
Customer Analytics

$100/hr - $200/hr

Starts Sep 15, 2015

15 Proposals Status: CLOSED

Client: B***** ************ ***

Posted: Aug 10, 2015

Trading Algorithm for stock trading

We are a start-up quantitative trading company using automated algorithms to trade stocks and futures. We have an existing intra-day strategy that needs machine learning experts to refine and improve. The initial task will be classifying trending and non-trending periods during a particular trading day. Another task will be predicting short-term price movements based on the order book sizes.

Skills we are looking for: machine learning, predictive analytics, SVM, knowledge of financial markets, prior experience working on finance-related projects.

The project can become an ongoing project if the initial results are satisfactory.
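
As context for proposals, here is a minimal sketch of the classification piece: an SVM separating trending from non-trending periods using simple order-book features. The feature columns and the "trending" label are hypothetical placeholders for whatever the existing strategy defines.

```python
# Sketch only: SVM classifier for trending vs. non-trending intraday periods.
# Feature columns and the "trending" label are hypothetical placeholders.
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

bars = pd.read_csv("intraday_bars.csv")
features = ["bid_depth", "ask_depth", "depth_imbalance", "realized_vol"]
X, y = bars[features], bars["trending"]                 # 1 = trending, 0 = not

split = int(len(bars) * 0.8)                            # walk-forward style split
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X[:split], y[:split])
print("Out-of-sample accuracy:", clf.score(X[split:], y[split:]))
```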

Machine Learning
Data Mining Algorithms
Support Vector Machines

$75/hr - $175/hr

Starts Aug 10, 2015

14 Proposals Status: COMPLETED

Client: A****** ****

Posted: Aug 06, 2015

Find Trends in Changes to Real Estate Values in Toronto

We sell real estate, specifically residential houses and apartments. We have access to data on 95% of all real estate transactions, including addresses, number of bedrooms, etc. We would like to mine this data, in conjunction with other data (such as demographics, employment, income growth, proximity to certain amenities, zoning changes, school districts, Walkscore, nearby developments), to determine the biggest determining factors for real estate price change.

Goals for this project

Minimum (content for article writing)

1. supply valuable content to educate buyers and sellers in the Toronto real estate market,

2. cite interesting case studies on valuation changes

Target (web application)

3. make predictions on which is the best neighbourhood to be in based on a psychographic profile. 

Outrageous (advisory service)

4. make predictions on which streets buyers are most likely going to demand, or which streets have a high likelihood of owners thinking of selling their homes

Datasources

* Demographic info http://www1.toronto.ca/wps/portal/contentonly?vgnextoid=2394fe17e5648410VgnVCM10000071d60f89RCRD

* data on housing sales (location, price, housing type, bedrooms, baths, land size, dates) can be exported as a CSV file

* addresses of homes where building permits were applied for, in .csv format

* Current zoning http://map.toronto.ca/maps/map.jsp?app=ZBL_CONSULT

* Walkscore
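
One straightforward way to surface the biggest determining factors from these data sources is a feature-importance ranking from a tree ensemble fit on the joined data. A minimal sketch, with hypothetical file and column names:

```python
# Sketch only: rank candidate drivers of price change with a random forest.
# File and column names are hypothetical; they would come from joining the
# transaction, demographic, permit, zoning, and Walkscore data listed above.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

homes = pd.read_csv("toronto_sales_joined.csv")
features = ["bedrooms", "baths", "land_size", "walkscore",
            "median_income_growth", "distance_to_amenities", "permit_count_nearby"]
X, y = homes[features], homes["annual_price_change_pct"]

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
ranking = pd.Series(rf.feature_importances_, index=features).sort_values(ascending=False)
print(ranking)          # larger importance ~ bigger determining factor
```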

Real Estate
Market Research
R&D

$2,500 - $15,000

Starts Sep 01, 2015

17 Proposals Status: CLOSED

Client: R*****

Posted: Aug 03, 2015

Data Scientist at Auto Lending Platform (Fintech)

The Company

We are a technology and data-driven auto lending platform that has originated over $100M of auto loans to date.  We use technology & analytics to originate, underwrite, & service profitable niche segments of the vast auto lending market.  We have a fantastic team of 40 people, and are based in Greater Boston, MA.

The Project 

Our stack is currently built predominantly in PHP and uses MySQL. We are looking for an individual highly experienced in programming (dynamically and statically typed languages), databases (relational, object-oriented, and time series), and statistical analysis (e.g., R, Python) to do a deep dive into our data structure, provide feedback on how to improve the performance of what we currently have, and recommend a strategy for our future growth.

The Candidate

- Experience managing technical teams, both in-house and outsourced
- Working knowledge of Source Code Management tools, e.g., git
- Mastery of software engineering processes and tools such as continuous integration, unit testing, code reviews, IDEs, etc.
- Expertise in both statically and dynamically typed programming languages, e.g., Java, JavaScript, and PHP
- A solid understanding of different types of databases: relational, object-oriented, and time series databases
- Experience with data analytics pipelines, e.g., Hadoop, Spark
- Experience with statistical modeling, e.g., R, Python
- An undergraduate degree in Computer Science or a related field from a top university
- Machine learning experience

Automotive
Financial Services
Electrical/Electronics

$75/hr - $105/hr

Starts Jul 27, 2015

15 Proposals Status: CLOSED

Client: F***** **** ********* ***

Posted: Jul 29, 2015

Daily Fantasy Sports algorithms

We are creating an advanced analytics and data platform to support the growing fantasy sports (and daily fantasy sports) industry, combining traditional stock analysis methodologies with a simple, user-friendly UX/UI to power smart decision making for the casual (and not-so-casual) sports fan.

We want the ability to create an algorithm that increases someone's chance of succeeding more frequently in daily fantasy sports. My first area of focus is Major League Baseball because it is currently in season and I have a strong passion for the game. My parents named me after a Major League Baseball player. That being said, I also enjoy PGA Golf, NBA Basketball, and NFL Football, and I would want to create algorithms for those sports as well. I want to use this for my own personal use as well as offer it as part of a Daily Fantasy Sports service to existing players. The more success I create, the more credibility the service I market to other players will have.

I am a Daily Fantasy Sports player with slightly above-average skill. The players who are consistently winning the daily games (most are professionals with a math background: engineers, poker players, stock brokers) are using specific strategies, and I believe that with the assistance of your experts and the data I see, we can create even better ones.

I read a 2012 study conducted by Stanford University stating that big data could improve your chances of winning by as much as 20%. I know what I am asking for is possible. I am looking for the expert who can assist me in making it a reality. I have articles and data I can forward once a connection with an expert is made. If the expert has watched the movie Moneyball, starring Brad Pitt, you will have a clue as to what I am seeking to do in the Daily Fantasy Sports world. I look forward to the adventure and experience.
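
For illustration, one common building block of DFS tools is a lineup selector that maximizes projected points under a salary cap. The sketch below uses a simple greedy heuristic with hypothetical data; production optimizers typically use integer programming with positional constraints.

```python
# Sketch only: greedy salary-cap lineup fill from projected points.
# The projections file, roster size, and $50,000 cap are hypothetical placeholders.
import pandas as pd

players = pd.read_csv("mlb_projections.csv")            # hypothetical: name, salary, proj_points
players["value"] = players["proj_points"] / players["salary"]

cap, roster_size, lineup, spent = 50_000, 9, [], 0
for _, p in players.sort_values("value", ascending=False).iterrows():
    if spent + p["salary"] <= cap and len(lineup) < roster_size:
        lineup.append(p["name"])
        spent += p["salary"]
print(lineup, spent)
```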

http://grantland.com/features/2015-mlb-avm-systems-ken-mauriello-jack-armbruster-moneyball-sabermetrics/

http://www.wsj.com/articles/a-fantasy-sports-wizards-winning-formula-wsj-money-june-2014-1401893587

https://datafloq.com/read/fantasy-sports-betting-next-big-thing-big-data/1189

http://insidebigdata.com/2015/07/24/daily-fantasy-sports-is-booming-thanks-to-big-data/

http://www.ibisworld.com/industry/fantasy-sports-services.html

Professional Services
Sports and Fitness
Consumer Experience

$100/hr - $120/hr

Starts Aug 14, 2015

10 Proposals Status: IN PROGRESS

Client: F******* ****** ******* ***

Posted: Jul 24, 2015

Sampling and Clustering Analysis Collaboration in Azure Machine Learning

Our data and data experiments are underway in Azure Machine Learning. We are looking for a statistics and data analysis expert to consult or collaborate with us in the Azure cloud environment. Our analysis benchmarks the energy use of a person's home against similar homes or peers. Some of our samples are biased while others are representative. We need assistance in clustering "peer groups", including weighting clusters, correcting for bias, and adjusting for cluster groups that are too small.
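
As a rough illustration of the kind of work involved, the sketch below clusters homes into peer groups and post-stratifies the sample weights against (hypothetical) known population shares. In practice this would be built inside an Azure ML experiment; the features, cluster count, and shares here are placeholders.

```python
# Sketch only: peer-group clustering plus post-stratification weights to correct
# sample bias. Features, cluster count, and population shares are placeholders;
# clusters that end up too small would be merged with a neighbor before weighting.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

homes = pd.read_csv("home_features.csv")                 # hypothetical export
features = ["sq_ft", "year_built", "occupants", "heating_type_code"]
X = StandardScaler().fit_transform(homes[features])

homes["cluster"] = KMeans(n_clusters=8, random_state=0).fit_predict(X)

# Post-stratification: weight = population share / sample share per cluster
sample_share = homes["cluster"].value_counts(normalize=True)
population_share = pd.Series({c: 1 / 8 for c in range(8)})   # placeholder: equal shares
homes["weight"] = homes["cluster"].map(population_share / sample_share)
```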

Primary Skills Needed: Statistical analysis using Azure ML Experiment tools. Regression, cluster analysis, stratification, segmentation, re-weighting, correcting for sample bias. (Note: the emphasis is on statistical analysis within Azure ML; coding and data management skills are not needed here.)

Start date: 7/26/15. Duration: a 1-2 week engagement.

Deliverables: Azure Machine Learning Experiment, regression coefficients, cluster analysis, as collaborator in our Azure ML Site

Energy and Utility
Customer Behavior Analysis
Customer Loyalty

$75/hr - $175/hr

Starts Jul 27, 2015

2 Proposals Status: CLOSED

Client: A******

Posted: Jul 24, 2015

Predicting best customers

We are a consulting and business art/gift company targeting enterprise clients for consulting and the broad business market for gifts and art. We focus on specific industry and expertise verticals such as health care, higher education, HR, and tech.

We have approximately 70,000 records of subscribers to our content and purchasers of our products and services. 

We are interested in profiling best customers in order to grow our retail market as well as identifying enterprise prospects who are aligned to our style of consulting.

Our data exists in CSV files; we have current subscribers to our daily email, unsubscribes, and retail customers.

We are looking for an analysis of the data to determine prospect profiles, affinities, and action steps for execution of marketing. Other services possibly required include locating aligned, established retailers and brands for partnership, etc.
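
One concrete way to profile best customers with these files is to treat retail purchasers as positives and score the remaining subscribers by modeled purchase likelihood. A minimal sketch, with hypothetical column names and join key:

```python
# Sketch only: score subscribers by likelihood of becoming retail customers.
# Column names and the "email" join key are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression

subs = pd.read_csv("subscribers.csv")
buyers = pd.read_csv("retail_customers.csv")
subs["is_buyer"] = subs["email"].isin(buyers["email"]).astype(int)

features = ["tenure_days", "opens_90d", "clicks_90d", "industry_code"]   # hypothetical
X = pd.get_dummies(subs[features], columns=["industry_code"])
model = LogisticRegression(max_iter=1000).fit(X, subs["is_buyer"])

subs["prospect_score"] = model.predict_proba(X)[:, 1]
print(subs.nlargest(20, "prospect_score")[["email", "prospect_score"]])
```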

Consumer Goods and Retail
Healthcare
Media and Advertising

$5,000 - $15,000

Starts Aug 03, 2015

30 Proposals Status: CLOSED

Client: G**********

Posted: Jul 20, 2015

Extracting & Compiling Online Court Data on Pending Court Appearances

We need to develop a system/program to extract specific information from public court files (online lists of pending court dates), feed it into a program that automatically locates the relevant email address for the attorney/law firm listed, and then send them a form email with inserted information about their next court date (date, time, type, location, etc.).

The problem to solve is to identify attorneys/law firms with pending court appearances in the near future (1-3 weeks) from available online court calendar dockets, extract the identifying information about the attorney/law firm (the contact information on the court's online service will generally not provide an email address, though it may provide a telephone number), take the court date, time, court address, and department number, locate the attorney's/law firm's email address from online data, and then insert the court appearance data into a form email to be sent to the attorney/law firm about their pending court date.

Steps:

1.  The system needs to go to each county/court district website and extract from the calendar of each courtroom or court docket the name of the attorney, case name, case number, type of appearance, date of appearance, time of appearance, department location, and court location.

An example showing how the data is structured (not all court websites are exactly the same) is http://www.lacourt.org/onlineservices/ON0001.aspx

Each court might term the calendar information slightly differently.

2.  Take the name of the attorney and use either the State Bar of California or a general Google search to locate his/her email address at a minimum; all contact information would be best.

3.  Take the information from No. 1 along with the email from No. 2 and format them into a template email that will then be sent automatically to the attorney. The system will need to update itself automatically, either weekly or daily, to retrieve the most recent calendar entries.
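
To make the intended pipeline concrete, here is a minimal sketch of steps 1 and 3. The CSS selectors, field names, and email details are hypothetical placeholders; each court website (including lacourt.org) has its own markup and terms of use that a real implementation must respect, and the email lookup in step 2 is left as a stub.

```python
# Sketch only: scrape a (hypothetical) court calendar page and send a templated email.
# Selectors, field names, addresses, and the SMTP host are placeholders.
import smtplib
from email.message import EmailMessage

import requests
from bs4 import BeautifulSoup

TEMPLATE = ("Dear {attorney},\n\nOur records show a {appearance_type} in case "
            "{case_number} on {date} at {time}, Dept. {department}, {court}.\n")

def scrape_calendar(url):
    """Parse one calendar page into appearance dicts (placeholder markup)."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    rows = soup.select("table.calendar tr")                    # placeholder selector
    return [{"attorney": r.select_one(".attorney").text.strip(),
             "case_number": r.select_one(".case-no").text.strip(),
             "appearance_type": r.select_one(".type").text.strip(),
             "date": r.select_one(".date").text.strip(),
             "time": r.select_one(".time").text.strip(),
             "department": r.select_one(".dept").text.strip(),
             "court": r.select_one(".court").text.strip()}
            for r in rows if r.select_one(".attorney")]

def notify(appearance, email_address, smtp):
    msg = EmailMessage()
    msg["From"] = "calendar-alerts@example.com"                 # placeholder sender
    msg["To"], msg["Subject"] = email_address, "Upcoming court appearance"
    msg.set_content(TEMPLATE.format(**appearance))
    smtp.send_message(msg)

# Usage outline (lookup_email() is the step-2 State Bar / web-search stub):
# with smtplib.SMTP("smtp.example.com") as smtp:
#     for a in scrape_calendar("https://example-court.gov/calendar"):
#         notify(a, lookup_email(a["attorney"]), smtp)
```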

In your bid, please submit a total time estimate for completion.

Hi-Tech
Legal
e-CRM

$75/hr - $100/hr

Starts Jul 22, 2015

9 Proposals Status: CLOSED

Client: A********** ******** ** ***** ****

Posted: Jul 15, 2015

Matching Providers