Human Generated Data

Title

Untitled (men looking at vacuum demonstration in shop window)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14605

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 98.8
Human 98.8
Person 97.9
Apparel 94.9
Clothing 94.9
Person 93.5
Accessory 87.3
Tie 87.3
Accessories 87.3
Indoors 86.2
Interior Design 86.2
Steamer 80.5
Room 73.6
Appliance 68.2
Clinic 65.6
Home Decor 58
Dress 57.3
Coat 56.9
Cleaning 56.1
Shorts 56

Imagga
created on 2022-01-29

medical 25.6
people 24.5
adult 23.5
person 23.4
male 22
home 21.5
man 20.8
kitchen 20.1
indoors 18.4
professional 17.2
interior 16.8
medicine 16.7
work 15.7
patient 15.5
modern 15.4
health 15.3
device 15.2
happy 15
doctor 15
nurse 14.9
hospital 14.6
clinic 14
care 14
occupation 13.7
smiling 13.7
room 13.7
portrait 12.9
house 12.5
exam 12.4
lifestyle 12.3
office 12.3
men 12
coat 12
stethoscope 11.7
chair 11.5
smile 11.4
clothing 11.3
indoor 10.9
job 10.6
working 10.6
standing 10.4
business 10.3
women 10.3
science 9.8
clinical 9.8
instrument 9.7
worker 9.3
20s 9.2
pretty 9.1
holding 9.1
human 9
cooking 8.7
chemistry 8.7
laboratory 8.7
equipment 8.6
happiness 8.6
hygiene 8.5
lab coat 8.4
clean 8.3
glasses 8.3
beaker 8.3
cheerful 8.1
life 8
water 8
bathroom 7.9
hands 7.8
microphone 7.8
assistant 7.8
lab 7.8
chemical 7.8
casual 7.6
research 7.6
elegance 7.6
healthy 7.6
fashion 7.5
shop 7.5
technology 7.4
inside 7.4
building 7.3
food 7.2
specialist 7.2
student 7.2
domestic 7.2
dress 7.2
looking 7.2
cleaner 7.2
sink 7.2
table 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 98.9
black and white 67.1
clothing 50.9
posing 45.8

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 89.1%
Surprised 46.2%
Calm 44%
Sad 3.7%
Happy 2.6%
Disgusted 1.5%
Angry 0.9%
Fear 0.7%
Confused 0.4%

AWS Rekognition

Age 23-33
Gender Female, 82%
Calm 62.4%
Sad 11%
Happy 9.6%
Surprised 7.6%
Disgusted 3.6%
Angry 2%
Fear 1.9%
Confused 1.8%

AWS Rekognition

Age 42-50
Gender Male, 99.3%
Calm 90%
Happy 7.4%
Sad 1.1%
Angry 0.6%
Confused 0.3%
Fear 0.3%
Disgusted 0.2%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Tie 87.3%

Captions

Microsoft

a person standing in front of a store 60.7%
a person holding a racket 38.3%
a person with a racket 38.2%

Text analysis

Amazon

QUEEN
REMINGTON
HLTER QUEEN
HLTER
0)
MJI7
MJI7 YT3RAS
YT3RAS
IIII

Google

TER
QUEEN
REMIDGTO
MJ17 YT3RA2 REMIDGTO TER QUEEN
YT3RA2
MJ17