Human Generated Data

Title

Untitled (woman and girl looking into shop window at mysterious vacuum display)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14682

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.5
Human 99.5
Person 98.9
Person 94.3
Clothing 89.6
Apparel 89.6
Bus 89.5
Transportation 89.5
Vehicle 89.5
Appliance 87.2
Shorts 68
Train 64.1
Home Decor 59
Room 58.5
Indoors 58.5
Furniture 56.5
Portrait 56
Photography 56
Face 56
Photo 56
Female 55
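
Labels of this kind are typically produced by Amazon Rekognition's DetectLabels operation. The following is a minimal sketch using boto3, not the pipeline actually used for this record; the file name "photo.jpg", the AWS credential setup, and the MaxLabels/MinConfidence values are illustrative assumptions.

```python
# Minimal sketch (assumptions: boto3 installed, AWS credentials configured,
# and a local file "photo.jpg" standing in for the catalogued photograph).
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns object/scene labels with confidence scores,
# comparable to the "Person 99.5 ... Female 55" list above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,       # illustrative cap
    MinConfidence=55,   # illustrative floor
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```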

Clarifai
created on 2023-10-27

people 99.7
vehicle 97.8
adult 97.7
transportation system 95.7
man 95.3
two 92.9
watercraft 92.5
wear 92.2
monochrome 90.5
three 89.7
group together 89.7
outfit 89.5
administration 88.7
woman 86.7
group 83
veil 80.9
nostalgia 76.7
aircraft 76.4
facial expression 76
one 75.9

Imagga
created on 2022-01-29

radio 31.5
vacuum 23.9
device 21.6
equipment 20.9
communication system 19.7
interior 17.7
home appliance 16.6
appliance 14.9
home 14.3
clean 13.3
microphone 12.7
water 12.7
bathroom 11.9
house 11.7
old 11.1
adult 11
machine 10.8
pump 10.7
wash 10.6
person 10.5
modern 10.5
sink 9.9
hand 9.9
kitchen 9.8
medical 9.7
technology 9.6
wall 9.4
working 8.8
indoors 8.8
building 8.7
chemical 8.7
shower 8.7
exercise bike 8.6
cleaner 8.6
industry 8.5
energy 8.4
floor 8.4
human 8.2
one 8.2
room 7.9
urban 7.9
face 7.8
people 7.8
health 7.6
bath 7.6
power 7.5
retro 7.4
antique 7.3
indoor 7.3
male 7.1
medicine 7

Microsoft
created on 2022-01-29

text 99.3
black and white 77.7
clothing 72.4
person 68.9

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 99.4%
Calm 98.3%
Surprised 0.8%
Sad 0.3%
Happy 0.2%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%
Fear 0%
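
Per-face attributes such as the age range, gender estimate, and emotion scores above are the kind of output Rekognition's DetectFaces call returns when all attributes are requested. A hedged sketch, again with an illustrative file name and credential setup:

```python
# Sketch only; "photo.jpg" and the AWS credential setup are assumptions.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, and more.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```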

Feature analysis

Amazon

Person 99.5%
Bus
Train

Categories

Imagga

interior objects 99.5%

Text analysis

Amazon

1745
QUEEN
REMINGTON
MJI7
WITH QUEEN
MJI7 YT33AS 002ИА
YT33AS
002ИА
WITH
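
Detections like the Amazon list above, including the mirrored shop-window strings, are what Rekognition's DetectText operation returns. A minimal sketch under the same assumptions as before:

```python
# Sketch only; the file name and AWS credentials are assumptions.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a full LINE or an individual WORD.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```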

Google

MJI7 YT3RA REMINGTON 1745 LTER QUEE
MJI7
YT3RA
REMINGTON
1745
LTER
QUEE
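
The Google results above are consistent with OCR output from the Cloud Vision API, where the first annotation is the full detected block and the remaining annotations are individual tokens. A sketch using the google-cloud-vision client, with the file name again assumed:

```python
# Sketch only; assumes the google-cloud-vision package and application
# default credentials are available, and "photo.jpg" stands in for the image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    content = f.read()

image = vision.Image(content=content)
response = client.text_detection(image=image)

# text_annotations[0] is the full block ("MJI7 YT3RA REMINGTON 1745 LTER QUEE");
# the remaining annotations are the individual tokens listed above.
for annotation in response.text_annotations:
    print(annotation.description)
```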