Human Generated Data

Title

Untitled (nine children posing with goat on outdoor steps)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9820

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Human 99.1
Person 99.1
Person 98
Person 97.8
Person 97.5
Person 96
Person 93.3
Mammal 87.9
Animal 87.9
Person 87.3
Horse 77.1
Outdoors 70.4
Amusement Park 69.7
Theme Park 69.7
Road 65.4
Cattle 58.5
Boat 57.2
Transportation 57.2
Vehicle 57.2
Bull 55.9
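The Amazon labels above are name/confidence pairs, with the same label often reported several times at different confidences. A minimal sketch of how one might post-process such a list — the data is transcribed from the tags above, and the 90% threshold is an arbitrary choice for illustration:

```python
# A few of the Amazon label/confidence pairs listed above.
labels = [
    ("Human", 99.1), ("Person", 99.1), ("Person", 98.0),
    ("Mammal", 87.9), ("Animal", 87.9), ("Horse", 77.1),
    ("Outdoors", 70.4), ("Cattle", 58.5), ("Boat", 57.2),
]

def high_confidence(labels, threshold=90.0):
    """Return distinct label names at or above the threshold, in order."""
    seen = []
    for name, score in labels:
        if score >= threshold and name not in seen:
            seen.append(name)
    return seen

print(high_confidence(labels))  # → ['Human', 'Person']
```

At a 90% cutoff only "Human" and "Person" survive, which matches the record's own Feature analysis section reporting Person at 99.1%.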

Imagga
created on 2022-01-28

television 42.1
telecommunication system 32.1
old 16.7
man 14.7
art 13.7
outdoors 12.1
black 12
people 11.7
outdoor 11.5
water 10.7
vintage 9.9
sculpture 9.7
fountain 9.5
portrait 9.1
history 8.9
landscape 8.9
sport 8.9
statue 8.7
scene 8.6
decoration 8.5
travel 8.4
wood 8.3
person 8.3
retro 8.2
aged 8.1
religion 8.1
architecture 7.8
antique 7.8
sitting 7.7
structure 7.7
culture 7.7
tree 7.7
texture 7.6
fashion 7.5
traditional 7.5
symbol 7.4
life 7.3
color 7.2
dress 7.2
adult 7.1
textured 7

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 94
black and white 85.8
horse 84.4
black 76.5
animal 67
mammal 50.7

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 99.8%
Sad 85.5%
Calm 9.1%
Confused 3.1%
Happy 0.5%
Angry 0.5%
Disgusted 0.5%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 54-64
Gender Male, 98.8%
Sad 36.1%
Calm 31.7%
Happy 10.2%
Confused 7.6%
Angry 4.8%
Disgusted 4.2%
Surprised 2.8%
Fear 2.6%

AWS Rekognition

Age 28-38
Gender Male, 74%
Happy 52.4%
Calm 27.6%
Sad 11.3%
Fear 5.1%
Angry 1%
Disgusted 1%
Confused 0.9%
Surprised 0.8%
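Each AWS Rekognition face block above lists one confidence score per emotion. These per-face distributions can be reduced to a single dominant emotion by taking the highest-scoring entry; the dictionaries below are transcribed (abbreviated) from the three face blocks above, and the reduction itself is an illustrative sketch, not part of the Rekognition output:

```python
# Abbreviated per-face emotion scores from the three AWS Rekognition
# blocks above; keys are emotions, values are confidence percentages.
faces = [
    {"Sad": 85.5, "Calm": 9.1, "Confused": 3.1, "Happy": 0.5},
    {"Sad": 36.1, "Calm": 31.7, "Happy": 10.2, "Confused": 7.6},
    {"Happy": 52.4, "Calm": 27.6, "Sad": 11.3, "Fear": 5.1},
]

# Dominant emotion per face: the key with the maximum score.
dominant = [max(face, key=face.get) for face in faces]
print(dominant)  # → ['Sad', 'Sad', 'Happy']
```

Note how the second face's call is much less certain (36.1% vs 85.5%), so a real pipeline might also want to keep the winning score alongside the label.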

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Horse 77.1%
Boat 57.2%

Captions

Microsoft

a person sitting on a bench with a dog 74.7%
a man and a woman sitting on a bench 65.5%
a couple of people that are sitting on a bench 65.4%

Text analysis

Amazon

EDICO
KODAK--S.VEEIA--EIFW