Get data from any page you want.
Need to talk to someone? Contact us—we’d love to help.
It seems this data source has never been run before. Changes are only available once the source has run at least a second time.
In this recipe we gather a list of URLs to the categories in which seed-stage investors usually invest. Each of these category pages lists profiles of top seed-stage investors in that category.
This recipe is best used in combination with the Top Seed Investors in an investment category recipe. Pick the category you want top investor profiles for, then use Top Seed Investors in an investment category to gather the actual profile information.
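As a sketch of that two-step workflow (assuming the exported CSV has a column holding each category page URL; the actual column names depend on how the recipe's fields are configured), chaining the two recipes might look like:

```python
import csv
import io

# Hypothetical sample of the category-list CSV this recipe exports;
# the real column names depend on the recipe's configured fields.
SAMPLE_CSV = """category,category_url
Fintech,https://example.com/categories/fintech
Healthcare,https://example.com/categories/healthcare
"""

def category_urls(csv_text):
    """Extract the category page URLs to feed into the
    'Top Seed Investors in an investment category' recipe."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["category_url"] for row in reader]

urls = category_urls(SAMPLE_CSV)
for url in urls:
    # Each URL would be passed to the second recipe here.
    print(url)
```

The example URLs and the `category_url` column are illustrative assumptions, not the recipe's documented output.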
While you are waiting, check out the 29,570 public data sources contributed by the rest of our community.
Sample code snippets to quickly import the data set into your application
For more information on how to automatically trigger an import, please refer to our WebHook API guide.
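The WebHook API guide describes the real notification payload. Purely as an illustrative sketch (the field names below are assumptions, not the documented WebHook API), a receiver that extracts the CSV URL to import from a notification might look like:

```python
import json

def handle_webhook(raw_body):
    """Parse a hypothetical webhook notification and return the CSV URL
    to import. The field names here are illustrative assumptions; consult
    the WebHook API guide for the actual payload shape."""
    payload = json.loads(raw_body)
    csv_url = payload.get("csv_url")
    if not csv_url:
        raise ValueError("notification did not include a CSV URL")
    return csv_url

# Example notification body (hypothetical shape):
body = '{"source_id": "n93222", "csv_url": "https://cache.getdata.io/.../latest_all.csv"}'
print(handle_webhook(body))
```

In a real integration this handler would sit behind an HTTP endpoint registered as the webhook URL, and the returned CSV URL would be fed into one of the import snippets below.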
Integrating with Java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.util.Arrays;

public class HelloWorld {
    public static void main(String[] args) {
        try {
            URL urlCSV = new URL(
                "https://cache.getdata.io/n93222_085fea7abdc5d904fe69a3081efd7398eses/latest_all.csv"
            );
            URLConnection urlConn = urlCSV.openConnection();
            InputStreamReader inputCSV = new InputStreamReader(urlConn.getInputStream());
            BufferedReader br = new BufferedReader(inputCSV);
            String line;
            String[] fields;
            while ((line = br.readLine()) != null) {
                // Each row
                fields = line.split(",");
                System.out.println(Arrays.toString(fields));
            }
            // Clean up the buffered reader
            br.close();
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}
Integrating with NodeJs
const csv = require('csv-parser');
const https = require('https');
const fs = require('fs');

const file = fs.createWriteStream("temp_download.csv");
const request = https.get(
  "https://cache.getdata.io/n93222_085fea7abdc5d904fe69a3081efd7398eses/latest_all.csv",
  function (response) {
    response.pipe(file);
  }
);

file.on('finish', function () {
  file.close();
  fs.createReadStream('temp_download.csv')
    .pipe(csv())
    .on('data', (row) => {
      // Each row
      console.log(row);
    })
    .on('end', () => {
      console.log('CSV file successfully processed');
    });
});
Integrating with PHP
$data = file_get_contents("https://cache.getdata.io/n93222_085fea7abdc5d904fe69a3081efd7398eses/latest_all.csv");
$rows = explode("\n", $data);
foreach ($rows as $row) {
    # Each row
    var_dump($row);
}
Integrating with Python
import csv
import io
import urllib.request

url = 'https://cache.getdata.io/n93222_085fea7abdc5d904fe69a3081efd7398eses/latest_all.csv'
response = urllib.request.urlopen(url)
cr = csv.reader(io.TextIOWrapper(response, encoding='utf-8'))
for row in cr:
    # Each row
    print(row)
Integrating with Ruby
require 'open-uri'
require 'tempfile'
require 'csv'

temp_file = Tempfile.new("getdata", encoding: 'ascii-8bit')
temp_file << URI.open("https://cache.getdata.io/n93222_085fea7abdc5d904fe69a3081efd7398eses/latest_all.csv").read
temp_file.rewind
CSV.foreach(temp_file.path, headers: :first_row) do |row|
  # Each row
  puts row
end
An API of Andreessen Horowitz's portfolio companies. We integrate this API into our own market ins...
created on 2021-06-29
An API of Andreessen Horowitz's portfolio companies. We integrate this API into our own market ins...
created on 2021-06-23
An API of Ame Cloud Ventures' portfolio companies. We integrate this API into our own market insig...
created on 2021-06-23