Human Generated Data

Title

Untitled (photograph of people at picnic taken from end of table)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3691

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.5
Human 99.5
Person 97.8
Person 97.5
Person 97.4
Person 96.9
Furniture 90
Table 89.5
Person 89.2
Dining Table 88.2
People 75.4
Meal 74.6
Food 74.6
Person 73.5
Art 68.4
Dating 66.1
Outdoors 59.7
Room 58.2
Indoors 58.2
Restaurant 56.9
Nature 56
Person 45.3
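The label/confidence pairs above match the shape of AWS Rekognition's documented `DetectLabels` response (`{"Labels": [{"Name": ..., "Confidence": ...}, ...]}`). A minimal sketch of filtering such a response by confidence, using hypothetical sample data drawn from a few of the tags listed above:

```python
# Sample data mirroring a few of the Amazon tags above, in the
# documented DetectLabels response shape. An assumption for
# illustration, not a live API call.
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.5},
        {"Name": "Furniture", "Confidence": 90.0},
        {"Name": "Dining Table", "Confidence": 88.2},
        {"Name": "Nature", "Confidence": 56.0},
        {"Name": "Person", "Confidence": 45.3},
    ]
}

def labels_above(response, min_confidence):
    """Return (name, confidence) pairs at or above min_confidence."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]

print(labels_above(sample_response, 55.0))
# → [('Person', 99.5), ('Furniture', 90.0), ('Dining Table', 88.2), ('Nature', 56.0)]
```

In a real call the response would come from `boto3`'s `rekognition` client (`client.detect_labels(Image=..., MinConfidence=...)`), which can apply the confidence threshold server-side.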

Clarifai
created on 2019-06-01

people 99.8
adult 98.3
group together 97.2
group 97
man 96.6
wear 93.7
child 93.4
many 91.5
woman 90.3
vehicle 89.9
two 86.2
several 83
four 81.5
military 81.1
sit 80.9
leader 75.5
three 75.3
war 75.2
administration 72.1
recreation 72

Imagga
created on 2019-06-01

negative 100
film 99.6
photographic paper 75.5
photographic equipment 50.3
winter 20.4
snow 19.8
cold 17.2
sketch 17.1
ice 16.4
water 16
people 13.9
city 13.3
drawing 12.5
frozen 12.4
outdoors 11.9
old 11.8
scene 11.2
building 11.2
decoration 11.1
sky 10.8
frost 10.6
human 10.5
tree 10
representation 10
art 9.8
black 9.6
season 9.3
religion 9
cool 8.9
group 8.9
seasonal 8.8
love 8.7
architecture 8.6
men 8.6
house 8.4
purity 8.3
person 8.1
man 8.1
river 8
surface 7.9
holiday 7.9
forest 7.8
weather 7.5
traditional 7.5
landscape 7.4
smooth 7.3
detail 7.2
dress 7.2
celebration 7.2
history 7.2
male 7.1
work 7.1
businessman 7.1
travel 7
glass 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

person 95.4
window 89.5
text 88.3
clothing 85.5
drawing 84.6
man 83.6
sketch 72.6
table 58.1
old 56.9
black and white 51.1

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 53.9%
Disgusted 8%
Sad 26.6%
Surprised 8.8%
Happy 21.3%
Angry 5.1%
Calm 17.1%
Confused 13.3%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Disgusted 45.1%
Sad 47.6%
Happy 45.1%
Surprised 45.3%
Angry 45.6%
Calm 50.8%
Confused 45.4%
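The per-face emotion scores above follow the `Emotions` array that AWS Rekognition's `DetectFaces` returns (one `{"Type": ..., "Confidence": ...}` entry per emotion). A minimal sketch of picking the dominant emotion, using sample values copied from the first face record above:

```python
# Sample Emotions list mirroring the first face record above,
# in the documented DetectFaces response shape. An assumption
# for illustration, not a live API call.
emotions = [
    {"Type": "SAD", "Confidence": 26.6},
    {"Type": "HAPPY", "Confidence": 21.3},
    {"Type": "CALM", "Confidence": 17.1},
    {"Type": "CONFUSED", "Confidence": 13.3},
    {"Type": "SURPRISED", "Confidence": 8.8},
    {"Type": "DISGUSTED", "Confidence": 8.0},
    {"Type": "ANGRY", "Confidence": 5.1},
]

# The dominant emotion is simply the highest-confidence entry.
dominant = max(emotions, key=lambda e: e["Confidence"])
print(dominant["Type"])  # → SAD
```

Note the confidences express how likely each label is for the face, not the person's actual internal state, and (as in the second face record above) they need not sum to 100%.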

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people posing for a photo 76.1%
a group of people standing next to a window 73.2%
a group of people standing in front of a window 70.4%