Human Generated Data

Title

Untitled (group photograph with two men standing on racing car)

Date

c. 1935-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4378

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.7
Human 99.7
Person 99.7
Person 99.4
Person 99.3
Person 99.2
Person 96.7
Person 95
Wheel 94.3
Machine 94.3
Vehicle 94.3
Transportation 94.3
Automobile 94.3
Car 94.3
Person 90.9
People 90.6
Person 86.7
Person 82.5
Shorts 77.9
Clothing 77.9
Apparel 77.9
Face 71.9
Spoke 69.9
Person 68.7
Person 68.3
Wheel 67.8
Female 67.3
Housing 66.2
Building 66.2
Child 64.6
Kid 64.6
Family 63.3
Portrait 61.5
Photo 61.5
Photography 61.5
Tire 61.1
Buggy 57.4
Girl 56
Carriage 55.5
Antique Car 55.3
Model T 55.3
Wagon 55.1
Person 41.8

Clarifai
created on 2019-06-01

people 99.9
vehicle 99.2
group 98.6
adult 98.2
man 96.9
transportation system 96.6
group together 96.4
nostalgia 93.3
carriage 92.9
woman 91.5
illustration 90.4
wagon 90.4
two 90.3
child 90.3
driver 88.9
several 88.6
four 87.2
cavalry 87
administration 85.6
many 84.9

Imagga
created on 2019-06-01

vehicle 24.9
people 23.4
tricycle 23.1
wheeled vehicle 21.2
man 20.6
old 19.5
wheelchair 18.4
couple 17.4
outdoors 17.3
kin 17.3
person 16.7
male 16.4
outdoor 16
transportation 15.2
adult 14.4
love 14.2
bride 13.1
groom 12.9
wheel 12.3
sky 12.1
summer 11.6
park 11.5
married 11.5
chair 11.5
portrait 11
happy 10.7
wife 10.4
women 10.3
cart 10.3
outside 10.3
wedding 10.1
transport 10
aged 10
disabled 9.9
conveyance 9.6
life 9.6
car 9.5
happiness 9.4
field 9.2
dress 9
husband 8.8
grass 8.7
lifestyle 8.7
bouquet 8.6
men 8.6
marriage 8.5
two 8.5
antique 8.4
vintage 8.3
sport 8.2
care 8.2
history 8
family 8
carriage 8
day 7.8
wagon 7.8
mother 7.8
clothing 7.8
ancient 7.8
travel 7.7
fashion 7.5
traditional 7.5
world 7.3
smiling 7.2
activity 7.2
statue 7.2
romance 7.1
horse 7.1
romantic 7.1
sunlight 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

outdoor 97.4
person 88.6
land vehicle 85.5
old 83.5
vehicle 83
clothing 80.5
wheel 76
transport 75.9
drawn 66.6
pulling 66.1
car 63.4
man 61.2
carriage 57.4
posing 38
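
Each service above reports labels with a confidence score (0-100), and the same label can appear more than once. A minimal sketch of filtering such output by a confidence threshold, assuming the tags are held as (label, score) pairs; the sample data is copied from the Amazon list above:

```python
# Filter machine-generated labels by confidence threshold.
# Sample pairs copied from the Amazon Rekognition list above.
labels = [
    ("Person", 99.7), ("Wheel", 94.3), ("Car", 94.3),
    ("Shorts", 77.9), ("Spoke", 69.9), ("Buggy", 57.4),
    ("Model T", 55.3), ("Person", 41.8),
]

def confident_labels(pairs, threshold=90.0):
    """Return the distinct labels at or above the threshold, keeping order."""
    seen = []
    for label, score in pairs:
        if score >= threshold and label not in seen:
            seen.append(label)
    return seen

print(confident_labels(labels))  # ['Person', 'Wheel', 'Car']
```

Lowering the threshold admits the weaker guesses ("Buggy", "Model T") that the services hedge on.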

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Happy 45.7%
Confused 45.4%
Angry 45.5%
Disgusted 45.4%
Surprised 45.5%
Sad 47.4%
Calm 50%

AWS Rekognition

Age 4-7
Gender Female, 54.5%
Angry 45.4%
Happy 45.4%
Sad 47.8%
Disgusted 45.5%
Confused 45.3%
Calm 50.1%
Surprised 45.4%

AWS Rekognition

Age 20-38
Gender Female, 50.8%
Calm 47.7%
Surprised 45.3%
Disgusted 45.2%
Happy 45.3%
Sad 49.2%
Confused 45.4%
Angry 47%
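
Each face record above carries one score per emotion, and the dominant emotion is simply the highest-scoring entry (Calm for the first two faces, Sad for the third). A minimal sketch, assuming the scores are kept in a dict; the values are copied from the first face record above:

```python
# Pick the dominant emotion from a Rekognition-style emotion score map.
# Scores copied from the first face record above.
face_emotions = {
    "Happy": 45.7, "Confused": 45.4, "Angry": 45.5,
    "Disgusted": 45.4, "Surprised": 45.5, "Sad": 47.4, "Calm": 50.0,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # ('Calm', 50.0)
```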

Feature analysis

Amazon

Person 99.7%
Wheel 94.3%
Car 94.3%

Text analysis

Amazon

7