Human Generated Data

Title

Untitled (man and woman looking into shop window at mysterious vacuum display)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14621

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.5
Human 99.5
Vehicle 98.5
Car 98.5
Automobile 98.5
Transportation 98.5
Person 97.8
Clothing 97.8
Apparel 97.8
Home Decor 86.5
Appliance 80.2
Overcoat 73.1
Coat 73.1
Suit 73.1
Hat 64.3
Sun Hat 60.6
Plant 60
Dish 59.5
Food 59.5
Meal 59.5
Photography 59.1
Portrait 59.1
Photo 59.1
Face 59.1
Female 58.4
Shorts 57
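
For context, tag lists like the Amazon block above are the typical output of AWS Rekognition's DetectLabels operation, which returns label names with confidence scores on a 0-100 scale. A minimal sketch with boto3 follows; the file name and region are hypothetical, and configured AWS credentials are assumed:

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
# The file name and region are placeholders, not the project's actual setup.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("gould_vacuum_display.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out around 57
)

# Each label carries a name and a confidence score (0-100).
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```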

Imagga
created on 2022-01-29

chair 27
seat 25.9
interior 23.9
furniture 23.7
room 23.2
barber chair 22.2
home 21.5
bathroom 19.9
equipment 19.8
modern 19.6
device 18.2
toilet 15.7
hospital 15.6
technology 15.6
clean 15
people 14.5
house 14.2
medical 14.1
inside 12.9
patient 12.4
indoors 12.3
window 12.2
shop 11.8
work 11.8
wash 11.6
design 11.2
wall 11.1
health 11.1
iron lung 11
toilet seat 11
architecture 10.9
3d 10.8
furnishing 10.5
clinic 10.4
doctor 10.3
basin 10.3
floor 10.2
barbershop 10
light 10
sink 10
male 9.9
salon 9.9
building 9.9
faucet 9.8
digital 9.7
bath 9.5
office 9.5
nurse 9.3
kitchen 9.2
business 9.1
old 9
person 8.9
respirator 8.8
medicine 8.8
man 8.7
instrument 8.5
machine 8.5
adult 8.4
computer 8.1
shower 8
tile 8
decoration 8
working 7.9
life 7.8
render 7.8
industry 7.7
city 7.5
washbasin 7.4
care 7.4
appliance 7.3
water 7.3
indoor 7.3
breathing device 7.3
metal 7.2
television 7.1
information 7.1
decor 7.1
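
The Imagga list matches the shape of Imagga's hosted tagging API, which returns English tags with 0-100 confidence scores. A minimal sketch using the requests library, with placeholder credentials and a hypothetical image URL:

```python
# Minimal sketch: image tagging via the Imagga REST API.
# The API key/secret and the image URL are placeholders.
import requests

API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/gould_vacuum_display.jpg"  # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Tags arrive sorted by confidence, matching the "tag score" layout above.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```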

Google
created on 2022-01-29

(no tags returned)

Microsoft
created on 2022-01-29

text 99.4
outdoor 88.2
person 81.6
clothing 79.7
vehicle 74.8
car 63.1
man 62.9
land vehicle 55.3
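
The Microsoft tags are consistent with the Azure Computer Vision analyze endpoint, which reports tag confidences on a 0-1 scale (scaled to percentages in the listing above). A minimal sketch against the v3.2 REST API; the endpoint, key, and image URL are placeholders:

```python
# Minimal sketch: tag extraction with the Azure Computer Vision REST API (v3.2).
import requests

ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/gould_vacuum_display.jpg"},  # hypothetical
)
resp.raise_for_status()

# Scale the 0-1 confidences to percentages to match the listing.
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```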

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Calm 81.5%
Surprised 7.5%
Happy 5.1%
Disgusted 1.4%
Confused 1.3%
Sad 1.3%
Fear 1%
Angry 0.9%

AWS Rekognition

Age 35-43
Gender Male, 99.6%
Happy 77.1%
Calm 21.8%
Surprised 0.6%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Sad 0%
Fear 0%
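
Blocks like the two above (one per detected face) match the shape of Rekognition's DetectFaces output: an estimated age range, a gender call with confidence, and a confidence-ranked list of emotions. A minimal sketch, again with a hypothetical file name:

```python
# Minimal sketch: face attributes with AWS Rekognition DetectFaces via boto3.
# Attributes=["ALL"] is required to get the age range and emotions.
import boto3

client = boto3.client("rekognition")

with open("gould_vacuum_display.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort descending to match the listing.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```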

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely
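
Google Vision reports face attributes as bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why the two blocks above read differently from the Rekognition ones. A minimal sketch with the google-cloud-vision client library; the file name is hypothetical:

```python
# Minimal sketch: face likelihoods with the Google Cloud Vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("gould_vacuum_display.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```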

Feature analysis

Amazon

Person 99.5%
Car 98.5%

Captions

Microsoft

a woman standing in front of a building 77.8%
a woman is standing in front of a building 66.5%
a woman that is standing in front of a building 66.4%
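
Candidate captions like these match Azure Computer Vision's /describe endpoint, which returns up to maxCandidates captions with 0-1 confidences. A minimal sketch, with the same placeholder endpoint and key as in the tags sketch above:

```python
# Minimal sketch: candidate captions from Azure Computer Vision /describe (v3.2).
import requests

ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/gould_vacuum_display.jpg"},  # hypothetical
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(caption["text"], round(caption["confidence"] * 100, 1))
```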

Text analysis

Amazon

QUEEN
FILTER QUEEN
FILTER
REMINGTON
LOUIS
MJI7
SU LOUIS
DISFA
MJI7 YT3RAS 002ИА
SU
YT3RAS
rasp DISFA
002ИА
rasp
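
The garbled-looking strings are genuine OCR output, preserved as-is: the engines transcribe the signage literally, including lettering that likely reads mirror-reversed through the shop window (e.g. "YT3RAS" and "002ИА"). The Amazon list has the shape of Rekognition's DetectText output, which returns both whole lines and individual words. A minimal sketch, file name hypothetical:

```python
# Minimal sketch: OCR with AWS Rekognition DetectText via boto3.
# The file name is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("gould_vacuum_display.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# DetectText returns LINE and WORD detections; print each with its type.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```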

Google

MJI7
MJI7 YT3RA REMINGTON POST-DISPA FILTER QUEEN
YT3RA
QUEEN
REMINGTON
POST-DISPA
FILTER
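
Google Vision's text detection returns one annotation covering the full detected block (the long concatenated entry above) plus one annotation per word. A minimal sketch with the client library; the file name is hypothetical:

```python
# Minimal sketch: OCR with Google Cloud Vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("gould_vacuum_display.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full text block; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```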