Human Generated Data

Title

Untitled (two couples looking into shop window at mysterious vacuum display)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14616

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99.7
Person 99.7
Person 99.4
Person 99.2
Person 99
Clothing 93.9
Apparel 93.9
Female 78.2
Suit 64.8
Coat 64.8
Overcoat 64.8
Shorts 61.6
Advertisement 61.6
Appliance 61.2
Woman 58.9
People 58.6
Poster 57
Dress 56.9
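
The ranked list above has the shape of output from the AWS Rekognition DetectLabels API. A minimal sketch of how such tags can be generated with boto3 (the filename is hypothetical, and the 2022 model version that produced the exact scores above is not pinned here):

    import boto3

    client = boto3.client("rekognition")
    with open("gould_vacuum_display.jpg", "rb") as f:  # hypothetical filename
        resp = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)
    for label in resp["Labels"]:
        # Prints e.g. "Person 99.7", matching the confidence-ranked list above.
        print(f"{label['Name']} {label['Confidence']:.1f}")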

Imagga
created on 2022-01-29

home 33.5
kitchen 32.1
adult 24.8
indoors 22.8
room 22.5
people 21.7
man 20.1
person 19.6
male 17.7
interior 17.7
newspaper 17.4
smiling 16.6
holding 15.7
happy 15.7
lifestyle 15.2
cheerful 14.6
professional 13.9
product 13.4
house 13.4
happiness 13.3
business 12.7
work 12.5
men 12
one 11.9
shop 11.8
food 11.6
smile 11.4
cooking 11.3
modern 11.2
device 11.1
portrait 11
cook 11
barbershop 10.9
domestic 10.8
worker 10.7
creation 10.6
medical 10.6
toilet 10.5
stethoscope 10.3
family 9.8
office 9.7
looking 9.6
standing 9.6
women 9.5
one person 9.4
casual 9.3
pretty 9.1
life 8.8
equipment 8.5
senior 8.4
meal 8.3
cup 8.3
home appliance 8.3
20s 8.2
businesswoman 8.2
technology 8.2
computer 8
job 8
working 7.9
housewife 7.9
preparing 7.8
chef 7.8
sitting 7.7
old 7.7
preparation 7.6
health 7.6
machine 7.6
plate 7.6
fashion 7.5
joy 7.5
doctor 7.5
human 7.5
inside 7.4
occupation 7.3
dress 7.2
mercantile establishment 7.2
medicine 7
table 7
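
Imagga exposes this style of confidence-ranked tagging through a REST endpoint. A sketch with the requests library, assuming Imagga's /v2/tags endpoint and placeholder credentials and filename:

    import requests

    # Placeholder credentials; Imagga uses HTTP Basic auth with a key/secret pair.
    with open("gould_vacuum_display.jpg", "rb") as f:  # hypothetical filename
        resp = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=("api_key", "api_secret"),
            files={"image": f},
        )
    for item in resp.json()["result"]["tags"]:
        # e.g. "home 33.5", as in the list above.
        print(f"{item['tag']['en']} {item['confidence']:.1f}")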

Google
created on 2022-01-29

(no tags recorded)

Microsoft
created on 2022-01-29

text 99
clothing 88.7
person 88.1
old 51.3

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 98%
Happy 47.3%
Calm 32.7%
Surprised 8%
Sad 3.5%
Disgusted 2.8%
Fear 2.6%
Angry 2.2%
Confused 0.9%

AWS Rekognition

Age 35-43
Gender Male, 97.4%
Calm 58.2%
Confused 16.7%
Sad 10.9%
Disgusted 6.3%
Surprised 2.6%
Fear 2.3%
Angry 1.8%
Happy 1.2%

AWS Rekognition

Age 50-58
Gender Male, 59.2%
Calm 96.9%
Sad 1%
Confused 0.7%
Happy 0.6%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Female, 54.7%
Happy 94.6%
Calm 2.5%
Sad 0.6%
Confused 0.6%
Disgusted 0.6%
Angry 0.5%
Fear 0.3%
Surprised 0.3%
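
Each block above corresponds to one FaceDetails entry from the AWS Rekognition DetectFaces API when the full attribute set is requested. A minimal boto3 sketch, again with a hypothetical filename:

    import boto3

    client = boto3.client("rekognition")
    with open("gould_vacuum_display.jpg", "rb") as f:  # hypothetical filename
        resp = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in resp["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        # Emotions come back as a full distribution; the blocks above list
        # every emotion, while this prints only the top one per face.
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(f"Age {age['Low']}-{age['High']}, "
              f"{gender['Value']} {gender['Confidence']:.1f}%, "
              f"{top['Type'].title()} {top['Confidence']:.1f}%")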

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely
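
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the four blocks above carry no scores. A sketch using the google-cloud-vision client library (v2+ assumed; filename hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("gould_vacuum_display.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())
    for face in client.face_detection(image=image).face_annotations:
        # Attributes come back as Likelihood enums, not numeric scores.
        print("Joy:", vision.Likelihood(face.joy_likelihood).name,
              "Surprise:", vision.Likelihood(face.surprise_likelihood).name,
              "Headwear:", vision.Likelihood(face.headwear_likelihood).name)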

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 74.9%
a vintage photo of a group of people posing for a picture 74.8%
a vintage photo of a person 74.7%
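
The Microsoft captions (and the Microsoft tags earlier in this record) match the output of Azure Computer Vision's Describe Image operation. A sketch against the v3.2 REST endpoint, with placeholder endpoint, key, and filename:

    import requests

    # Placeholder endpoint and subscription key.
    url = "https://<resource>.cognitiveservices.azure.com/vision/v3.2/describe"
    headers = {"Ocp-Apim-Subscription-Key": "<key>",
               "Content-Type": "application/octet-stream"}
    with open("gould_vacuum_display.jpg", "rb") as f:  # hypothetical filename
        resp = requests.post(url, headers=headers,
                             params={"maxCandidates": "3"}, data=f.read())
    for cap in resp.json()["description"]["captions"]:
        # e.g. "a vintage photo of a group of people posing for the camera 74.9%"
        print(f"{cap['text']} {cap['confidence']:.1%}")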

Text analysis

Amazon

FILTER
REMINGTON
MJI7
FILTER una
MJI7 YE3RAS
YE3RAS
una
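
These strings have the shape of AWS Rekognition DetectText output, which returns both LINE and WORD detections; that is why fragments such as "FILTER" and "MJI7" appear both alone and inside longer lines. A boto3 sketch (hypothetical filename):

    import boto3

    client = boto3.client("rekognition")
    with open("gould_vacuum_display.jpg", "rb") as f:  # hypothetical filename
        resp = client.detect_text(Image={"Bytes": f.read()})
    for det in resp["TextDetections"]:
        # Type is LINE or WORD; both granularities are listed above.
        print(det["Type"], det["DetectedText"], f"{det['Confidence']:.1f}")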

Google

REMIRGTO
TER
MJI7 YT3RA2 0 REMIRGTO TER
MJI7
0
YT3RA2
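
Google's strings match the Vision API's TEXT_DETECTION feature, where the first annotation is the full detected text and the remaining annotations are individual tokens, so "MJI7 YT3RA2 0 REMIRGTO TER" appears both joined and split. A sketch with the google-cloud-vision client (hypothetical filename):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("gould_vacuum_display.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())
    resp = client.text_detection(image=image)
    # text_annotations[0] is the full text block; the rest are word-level.
    for ann in resp.text_annotations:
        print(ann.description)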