Human Generated Data

Title

Untitled (family inside run-down house)

Date

1957

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16000.3

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.8
Human 99.8
Person 99.4
Person 98.2
Room 97.2
Indoors 97.2
Furniture 79.3
People 71
Dressing Room 70.4
Apparel 66.3
Clothing 66.3
Advertisement 60.3
Interior Design 59.4

Imagga
created on 2022-02-05

shop 46.7
barbershop 35
mercantile establishment 31.2
home 28.7
interior 27.4
refrigerator 25.9
sliding door 25.6
door 24.5
white goods 23.8
house 22.6
place of business 20.8
window 20.1
people 19.5
home appliance 19.2
man 18.8
indoors 17.6
room 17.1
furniture 16.4
case 16.2
person 15.7
inside 15.6
cabinet 15.4
movable barrier 15.4
appliance 14.2
modern 14
happy 13.8
male 12.8
light 12.7
kitchen 12.3
adult 12.3
smile 12.1
luxury 12
indoor 11.9
looking 11.2
women 11.1
style 10.4
portrait 10.4
establishment 10.3
business 10.3
medicine chest 10.2
work 10.2
barrier 10.2
architecture 10.1
decoration 10.1
glass 10.1
city 10
worker 9.8
old 9.8
building 9.7
design 9.6
wall 9.4
buy 9.4
fashion 9
black 9
shoe shop 8.7
smiling 8.7
apartment 8.6
men 8.6
retail 8.5
floor 8.4
wood 8.3
lady 8.1
new 8.1
lifestyle 7.9
urban 7.9
standing 7.8
ancient 7.8
pretty 7.7
attractive 7.7
bathroom 7.6
hotel 7.6
casual 7.6
clothing 7.6
vintage 7.4
shopping 7.3
clinic 7.2
dress 7.2
open 7.2
life 7.2
job 7.1
happiness 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 98
black and white 91.1
clothing 76.8
standing 75.5
person 73
house 54.4

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 95%
Calm 90.9%
Disgusted 3.7%
Surprised 1.6%
Fear 1.3%
Angry 1%
Sad 0.7%
Happy 0.6%
Confused 0.2%

AWS Rekognition

Age 23-33
Gender Female, 99.7%
Calm 48.3%
Happy 31.3%
Disgusted 5.2%
Fear 5.2%
Surprised 3.1%
Sad 2.8%
Angry 2.1%
Confused 2%

AWS Rekognition

Age 35-43
Gender Female, 70.9%
Disgusted 73.6%
Calm 22.5%
Sad 0.9%
Happy 0.7%
Fear 0.6%
Confused 0.6%
Angry 0.5%
Surprised 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people standing on top of a refrigerator 71.3%
a group of people standing in front of a refrigerator 71.2%
a group of people standing next to a refrigerator 71.1%

Text analysis

Amazon

MAGON
٤١٢и

Google

HAGOX
HAGOX