Human Generated Data

Title

Untitled (woman driving car in field with small child in white at back window)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6057

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Human 99.6
Person 99.6
Person 99.6
Person 99.5
Person 99.1
Vehicle 99
Automobile 99
Model T 99
Antique Car 99
Transportation 99
Person 97.7
Car 93.6
Wheel 93.4
Machine 93.4
Wheel 93.1
Person 88.8
Car 86.9
Tire 82.5
Wheel 77.8
Person 76.9
Spoke 74
Car Wheel 68.6
People 65.3
Person 63.7
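Label services like the ones above return each tag as a name/confidence pair, often with duplicates at different confidences (e.g. the repeated "Person" entries). A minimal sketch, not the museum's actual pipeline, of deduplicating such a list and keeping only high-confidence tags; the label values are copied from the Amazon list above for illustration:

```python
# A few (name, confidence) pairs copied from the Amazon tag list above.
labels = [
    ("Person", 99.6), ("Vehicle", 99.0), ("Automobile", 99.0),
    ("Model T", 99.0), ("Wheel", 93.4), ("Person", 88.8),
    ("Tire", 82.5), ("Spoke", 74.0), ("Car Wheel", 68.6),
]

def top_labels(labels, threshold=80.0):
    """Keep each label once, at its highest confidence, above a cutoff."""
    best = {}
    for name, conf in labels:
        if conf >= threshold and conf > best.get(name, 0.0):
            best[name] = conf
    # Highest-confidence tags first
    return sorted(best.items(), key=lambda kv: -kv[1])

print(top_labels(labels))
```

With the 80% cutoff this keeps six distinct tags, led by ("Person", 99.6), and drops low-confidence entries such as "Spoke" and "Car Wheel".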

Clarifai
created on 2019-05-30

snow 99.5
winter 99.2
people 98.8
vehicle 95.9
man 95.4
street 95
group 93
cold 92.8
adult 91.2
outdoors 89.9
two 89.3
woman 88.5
ice 87.5
wear 86.4
transportation system 86
travel 83.8
child 83.5
road 82.1
frost 81.6
recreation 80.6

Imagga
created on 2019-05-30

car 100
model t 72.2
motor vehicle 72.1
jeep 35
tire 31.4
road 28.9
vehicle 27.3
wheeled vehicle 24.6
hoop 20.8
transportation 20.6
snow 19.7
auto 19.1
automobile 18.2
drive 18
travel 17.6
old 17.4
sky 16.6
street 16.6
wheel 16.4
transport 15.5
band 15.3
winter 15.3
truck 14.5
landscape 14.1
trees 13.3
billboard 12.3
cold 12
speed 11.9
outdoors 11.2
tree 10.8
vintage 10.7
driving 10.6
weather 10.3
strip 10.2
light 10
outdoor 9.9
signboard 9.4
season 9.3
person 9.1
city 9.1
sign 9
vacation 9
rural 8.8
cars 8.8
urban 8.7
traffic 8.5
adventure 8.5
structure 8.3
ice 8.3
tourism 8.2
retro 8.2
building 8.1
pickup 7.9
country 7.9
sunny 7.7
motor 7.7
house 7.5
park 7.4
mountains 7.4
tourist 7.4
summer 7.1

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

land vehicle 95.3
vehicle 94.6
wheel 88.9
car 83.2
person 81.8
clothing 78.9
man 68.9
tire 50.1

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Happy 49.6%
Surprised 49.5%
Confused 49.6%
Angry 49.6%
Sad 50.1%
Calm 49.6%
Disgusted 49.5%

AWS Rekognition

Age 35-55
Gender Female, 50.2%
Disgusted 49.5%
Happy 49.5%
Calm 49.5%
Sad 50.4%
Confused 49.5%
Angry 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Happy 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 49.6%
Angry 49.5%
Sad 50.4%
Surprised 49.5%

AWS Rekognition

Age 15-25
Gender Female, 50.2%
Confused 49.5%
Happy 49.5%
Angry 49.7%
Sad 49.7%
Surprised 49.5%
Calm 49.9%
Disgusted 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Confused 49.5%
Happy 49.5%
Angry 49.5%
Sad 50.2%
Surprised 49.5%
Calm 49.8%
Disgusted 49.5%

AWS Rekognition

Age 23-38
Gender Male, 50.2%
Disgusted 49.5%
Angry 49.5%
Surprised 49.6%
Confused 49.5%
Happy 49.6%
Sad 49.6%
Calm 50.1%

AWS Rekognition

Age 29-45
Gender Male, 50%
Disgusted 49.5%
Calm 49.7%
Surprised 49.5%
Sad 50.1%
Angry 49.6%
Confused 49.6%
Happy 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Happy 49.6%
Surprised 49.5%
Confused 49.5%
Angry 49.6%
Sad 50%
Calm 49.7%
Disgusted 49.5%
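Each Rekognition face block above gives an estimated age range, a gender guess, and one confidence per emotion. Note that the emotion scores cluster tightly around 49.5–50.4%, so the "dominant" emotion is only a slim plurality. A minimal sketch of reading off the top emotion and its margin, using scores copied from the second face block above:

```python
# Emotion scores (percent) copied from the second face block above.
emotions = {
    "Disgusted": 49.5, "Happy": 49.5, "Calm": 49.5,
    "Sad": 50.4, "Confused": 49.5, "Angry": 49.5, "Surprised": 49.5,
}

def dominant_emotion(scores):
    """Return the highest-scoring emotion and its margin over the runner-up."""
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    (top, top_score), (_, second_score) = ranked[0], ranked[1]
    return top, round(top_score - second_score, 1)

print(dominant_emotion(emotions))  # → ('Sad', 0.9)
```

A margin under one percentage point, as here, suggests the classifier found no clear emotional expression rather than a confidently "sad" face.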

Feature analysis

Amazon

Person 99.6%
Car 93.6%
Wheel 93.4%

Categories