Human Generated Data

Title

Untitled (man working in foundry)

Date

c. 1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15995

Machine Generated Data

Tags

Amazon
created on 2022-03-25

Human 99.7
Person 99.7
Carpenter 96.2
Wood 88.4
Plywood 76.6
Box 66.4
Apparel 63.2
Clothing 63.2
Machine 61.1
Shorts 59.9
Tool 57.8
Pants 57.6
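The Rekognition tags above are (label, confidence) pairs. As an illustration only, a minimal Python sketch (with the pairs transcribed from the list above) shows how such tags might be filtered by a confidence threshold; the threshold value is an arbitrary assumption, not part of the record:

```python
# Machine-generated tags as (label, confidence) pairs,
# transcribed from the Amazon list above.
tags = [
    ("Human", 99.7), ("Person", 99.7), ("Carpenter", 96.2),
    ("Wood", 88.4), ("Plywood", 76.6), ("Box", 66.4),
    ("Apparel", 63.2), ("Clothing", 63.2), ("Machine", 61.1),
    ("Shorts", 59.9), ("Tool", 57.8), ("Pants", 57.6),
]

def confident_labels(tags, threshold=75.0):
    """Return labels meeting the confidence threshold, highest score first."""
    return [label for label, score in sorted(tags, key=lambda t: -t[1])
            if score >= threshold]

print(confident_labels(tags))  # → ['Human', 'Person', 'Carpenter', 'Wood', 'Plywood']
```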

Imagga
created on 2022-03-25

man 30.9
musical instrument 29.1
person 27.8
male 26.9
people 23.4
stringed instrument 20.6
adult 20.1
lifestyle 17.3
black 16.8
portrait 14.2
holding 13.2
happy 13.2
device 13.1
room 13.1
music 13
play 12.9
home 12.8
violin 12.7
indoors 12.3
smile 12.1
bowed stringed instrument 11.6
chair 11.3
boy 11.3
one 11.2
body 11.2
sitting 11.2
casual 11
happiness 11
musician 10.9
guitar 10.8
child 10.5
attractive 10.5
performer 10.4
style 10.4
wind instrument 10.3
men 10.3
youth 10.2
washboard 10
joy 10
newspaper 9.9
cheerful 9.8
interior 9.7
smiling 9.4
instrument 9.2
dark 9.2
human 9
fun 9
handsome 8.9
product 8.9
sexy 8.8
hair 8.7
guy 8.7
fashion 8.3
occupation 8.2
playing 8.2
banjo 8.1
accordion 8.1
posing 8
businessman 7.9
clothing 7.9
rock 7.8
pretty 7.7
performance 7.7
studio 7.6
singer 7.6
blond 7.5
senior 7.5
business 7.3
world 7.3
holiday 7.2
teacher 7.1
women 7.1
face 7.1
kid 7.1
keyboard instrument 7

Google
created on 2022-03-25

Microsoft
created on 2022-03-25

text 94
black and white 82.7
black 72.1
person 71.2
old 69
statue 57.2
man 55.9

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 53%
Sad 71%
Disgusted 11.7%
Calm 9.5%
Angry 4.2%
Surprised 1.3%
Happy 1.3%
Fear 0.5%
Confused 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a vintage photo of a person 79.9%
a vintage photo of a person 79.8%
an old photo of a person 77.6%