Get data from any page you want.
Need to talk to someone? Contact us—we’d love to help.
| text | School attended and graduation year | origin_pattern | origin_url | createdAt | updatedAt | pingedAt |
|---|---|---|---|---|---|---|
| He had gone to Shore (2016) I had been to Pymble (2016) it was after a party, we ... | - Pymble Ladies College 2016 | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
| I can recall countless instances of boys differing in age at a number of private ... | - Scots 2014 | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
| I had just started year 9 he was my boyfriend from SCOTS and I went to his house ... | - Kambala | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
| I had just turned 18 and my boyfriend was bringing me home to bed from my party. ... | - Kambala | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
| I was 16 at a party and was kissing a Knox boy in the year above me. He asked me ... | - Brigidine 2015 | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
| I was 16 years old - and went out with a group of friends in Mosman to a party. I... | - Monte Sant’ Angelo Mercy College 2014 | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
| I was hosting and i had around 30 people in my house. I don’t really remember how... | - kambala 2022 | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
| I was in year 10 when a Sydney Grammar boy raped me. He was my “friend” which gav... | - Kambala | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
| I went to Kincoppal and he went to Cranbrook. We were in year 9. He was the first... | - Kincoppal - Rose Bay 2015 | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
| I went to St Catherine’s and he went to Cranbrook. It was New Year’s Eve and I wa... | - St Catherine’s 2017 | https://www.teachusconsent.com/testimonies | https://www.teachusconsent.com/testimonies | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 | 2021-03-04 01:38:41 |
Sample code snippets to quickly import the data set into your application
For more information on how to automatically trigger an import, please refer to our WebHook API guide
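The WebHook API guide itself defines the real trigger mechanism; as a rough sketch of what a programmatic trigger might look like, assuming the API accepts a JSON POST to a per-source run URL (the endpoint path and payload shape below are illustrative assumptions, not the documented API):

```python
import json
import urllib.request

# NOTE: the endpoint path and payload below are illustrative assumptions,
# not the documented getdata.io API — consult the WebHook API guide.
TRIGGER_URL = "https://getdata.io/api/sources/n85098/run"  # assumed endpoint

req = urllib.request.Request(
    TRIGGER_URL,
    data=json.dumps({"webhook_url": "https://example.com/import-callback"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # left commented out until the real endpoint is confirmed
```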
Integrating with Java
```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.util.Arrays;

public class HelloWorld {
    public static void main(String[] args) {
        try {
            URL urlCSV = new URL(
                "https://cache.getdata.io/n85098_fbeaa932a90358e7d1a281ce60a4143beses/latest_all.csv"
            );
            URLConnection urlConn = urlCSV.openConnection();
            InputStreamReader inputCSV = new InputStreamReader(urlConn.getInputStream());
            BufferedReader br = new BufferedReader(inputCSV);
            String line;
            String[] fields;
            while ((line = br.readLine()) != null) {
                // Each row
                // Note: a naive split breaks on quoted fields containing commas;
                // use a CSV library (e.g. Apache Commons CSV) in production.
                fields = line.split(",");
                System.out.println(Arrays.toString(fields));
            }
            // clean up buffered reader
            br.close();
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}
```
Integrating with NodeJs
```javascript
const csv = require('csv-parser');
const https = require('https');
const fs = require('fs');

const file = fs.createWriteStream("temp_download.csv");
const request = https.get(
  "https://cache.getdata.io/n85098_fbeaa932a90358e7d1a281ce60a4143beses/latest_all.csv",
  function (response) {
    response.pipe(file);
  }
);

file.on('finish', function () {
  file.close();
  fs.createReadStream('temp_download.csv')
    .pipe(csv())
    .on('data', (row) => {
      // Each row
      console.log(row);
    })
    .on('end', () => {
      console.log('CSV file successfully processed');
    });
});
```
Integrating with PHP
```php
$data = file_get_contents("https://cache.getdata.io/n85098_fbeaa932a90358e7d1a281ce60a4143beses/latest_all.csv");
$rows = explode("\n", $data);
foreach ($rows as $row) {
    # Each row — str_getcsv() handles quoted fields containing commas
    $fields = str_getcsv($row);
    var_dump($fields);
}
```
Integrating with Python
```python
import csv
import urllib.request

url = 'https://cache.getdata.io/n85098_fbeaa932a90358e7d1a281ce60a4143beses/latest_all.csv'
response = urllib.request.urlopen(url)
# Decode the byte stream so csv.reader receives text lines
lines = (line.decode('utf-8') for line in response)
for row in csv.reader(lines):
    # Each row
    print(row)
```
Integrating with Ruby
```ruby
require 'open-uri'
require 'tempfile'
require 'csv'

temp_file = Tempfile.new("getdata", encoding: 'ascii-8bit')
temp_file << URI.open("https://cache.getdata.io/n85098_fbeaa932a90358e7d1a281ce60a4143beses/latest_all.csv").read
temp_file.rewind

CSV.foreach(temp_file.path, headers: :first_row) do |row|
  # Each row
  puts row
end
```
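One caveat that applies to any snippet splitting rows on a bare comma: free-text fields like the testimonies above almost certainly contain commas inside quoted values, so a naive split over-segments the row. A minimal sketch of the difference using Python's `csv` module (the sample row below is made up for illustration, not taken from the data set):

```python
import csv
import io

# Made-up row whose first field contains commas inside quotes,
# mimicking the testimony text column of the data set above.
sample = '"I was 16, at a party, and...","Brigidine 2015"\n'

naive = sample.strip().split(",")               # breaks the quoted field apart
proper = next(csv.reader(io.StringIO(sample)))  # respects the quoting

print(len(naive))   # 4 — the quoted field was split into pieces
print(proper)       # ['I was 16, at a party, and...', 'Brigidine 2015']
```

The same consideration applies to the Java and PHP snippets: prefer a real CSV parser (Apache Commons CSV, `str_getcsv`) over string splitting.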
We monitor the news headlines from this website, which we then use to train our sentiment analysis e...
created on 2024-01-02
We monitor the news headlines from this website, which we then use to train our sentiment analysis e...
created on 2024-01-01
We monitor the news headlines from this website, which we then use to train our sentiment analysis e...
created on 2023-10-09
We monitor the news headlines from this website, which we then use to train our sentiment analysis e...
created on 2022-07-06