Human Generated Data

Title

Untitled (adults and children sitting on large rocks outside rustic cabin)

Date

c. 1950, printed later

People

Artist: Jack Rodden Studio, American, 1914-2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13820

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99
Person 99
Person 96.1
Person 94.1
Advertisement 87.9
Poster 87.9
Collage 87.9
Mammal 86.8
Horse 86.8
Animal 86.8
Apparel 86.3
Clothing 86.3
Person 83.2
Person 77.4
Person 77.4
Person 77.3
Display 76
Screen 76
Electronics 76
Monitor 76
Face 74.5
Person 72.4
People 65.7
Sailor Suit 63
LCD Screen 62.6
Person 62
Person 59
Fashion 55.9
Robe 55.9
Military 55.3

Clarifai
created on 2019-11-16

people 99.8
group 97.9
man 95.7
adult 94.7
many 94.5
vehicle 94.1
no person 93.9
one 92.2
war 92
two 89.5
monochrome 88.1
child 86.1
wear 85.4
movie 85.3
furniture 85
woman 84
military 82.6
art 82.6
indoors 81
music 79.4

Imagga
created on 2019-11-16

shop 74.6
mercantile establishment 56.2
shoe shop 52
case 44.9
place of business 37.4
barbershop 29.5
old 20.9
establishment 18.7
window 18
building 16.7
ancient 16.4
vintage 15.7
stall 15.7
architecture 14.8
antique 14.7
history 14.3
black 12.6
wall 12
art 11.7
glass 11.7
city 11.6
interior 11.5
retro 11.5
grunge 11.1
historic 11
house 10.9
light 10.7
people 10
aged 9.9
dirty 9.9
urban 9.6
brown 8.8
decoration 8.8
man 8.7
texture 8.3
religion 8.1
holiday 7.9
detail 7.2
home 7.2
travel 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

black and white 95.6
text 94.2
person 88.9
clothing 87.6
monochrome 84.8
street 72.8
room 40.3
old 40

Face analysis

Amazon

AWS Rekognition

Age 13-25
Gender Female, 50.7%
Confused 45.2%
Surprised 45.2%
Happy 52.5%
Sad 45.2%
Disgusted 45.2%
Angry 45.1%
Calm 46.6%
Fear 45.1%

AWS Rekognition

Age 36-54
Gender Male, 51.1%
Happy 54.1%
Fear 45.1%
Sad 45.1%
Confused 45%
Disgusted 45%
Angry 45.1%
Calm 45.4%
Surprised 45.1%

AWS Rekognition

Age 3-9
Gender Female, 54.5%
Angry 45.3%
Confused 45.2%
Disgusted 45.1%
Happy 45%
Sad 46.6%
Fear 51.2%
Calm 46.1%
Surprised 45.5%

AWS Rekognition

Age 2-8
Gender Female, 51.3%
Fear 50.4%
Sad 45.8%
Disgusted 45%
Surprised 47.6%
Calm 45.8%
Happy 45%
Angry 45.2%
Confused 45.2%

AWS Rekognition

Age 24-38
Gender Female, 50%
Angry 49.5%
Happy 49.6%
Disgusted 49.5%
Confused 49.5%
Surprised 49.5%
Calm 50.2%
Fear 49.5%
Sad 49.6%

AWS Rekognition

Age 17-29
Gender Male, 50.2%
Happy 49.5%
Fear 49.5%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Sad 50.3%
Confused 49.5%
Calm 49.6%

AWS Rekognition

Age 14-26
Gender Male, 50.1%
Sad 49.8%
Surprised 49.5%
Angry 49.6%
Calm 49.6%
Happy 49.7%
Fear 49.7%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 23-35
Gender Female, 50.1%
Angry 50%
Happy 49.5%
Disgusted 49.5%
Confused 49.5%
Surprised 49.5%
Calm 49.5%
Fear 49.6%
Sad 49.9%

AWS Rekognition

Age 26-40
Gender Female, 50.1%
Surprised 49.5%
Angry 49.6%
Happy 49.5%
Fear 49.9%
Calm 49.5%
Disgusted 49.5%
Sad 50%
Confused 49.6%

AWS Rekognition

Age 5-15
Gender Male, 53.2%
Surprised 45%
Happy 45%
Confused 45%
Sad 54.2%
Angry 45.1%
Fear 45.4%
Disgusted 45%
Calm 45.3%

AWS Rekognition

Age 33-49
Gender Female, 50.3%
Calm 49.5%
Disgusted 49.5%
Sad 49.5%
Happy 49.5%
Angry 49.5%
Fear 50.5%
Surprised 49.5%
Confused 49.5%

Feature analysis

Amazon

Person 99%
Horse 86.8%

Captions

Microsoft

a group of people standing in front of a window 56.7%
a group of people standing next to a window 56.5%
a group of people in a room 56.4%

Text analysis

Amazon

GOREE

Google

GOREE
GOREE