Human Generated Data

Title

Untitled (men looking at vacuum display inside shop window)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14551

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.6
Human 99.6
Person 99.6
Person 99.2
Person 98.9
Clothing 98.5
Apparel 98.5
Appliance 81.8
Metropolis 77.3
Building 77.3
Urban 77.3
City 77.3
Town 77.3
Overcoat 67.9
Coat 67.9
Dish 58.5
Food 58.5
Meal 58.5
Suit 58.5
Hat 56.9
Steamer 56.9
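
The label and confidence pairs above are the form of output returned by Amazon Rekognition's DetectLabels operation. As a rough illustration only (the S3 location and thresholds below are placeholders, not the actual pipeline behind this record), comparable tags can be retrieved with boto3:

import boto3

# Hypothetical image reference and thresholds; not the values used for this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=25,
    MinConfidence=50.0,
)

# Each entry mirrors the "Person 99.6" style rows above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')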

Clarifai
created on 2023-10-27

people 99.9
group together 98.9
adult 98.7
vehicle 97.9
monochrome 97.7
group 96.3
man 96.2
outfit 95.9
lid 94.1
uniform 93.9
several 93.5
administration 93.3
wear 92.2
veil 91.6
three 90.3
four 89
military 88.3
transportation system 87.3
leader 86.8
watercraft 85.9

Imagga
created on 2022-01-29

brass 27.5
glass 24.3
device 21.8
wind instrument 21.1
interior 17.7
kitchen 16.3
home 15.9
musical instrument 15.8
equipment 15.6
food 15.2
table 14.8
dinner 14.5
trombone 14.3
room 14
technology 13.3
modern 12.6
case 12
plate 11.8
people 11.7
drink 11.7
decoration 11.6
medical 11.5
luxury 11.1
industry 11.1
machine 10.8
man 10.7
medicine 10.6
party 10.3
work 10.3
restaurant 10
house 10
clean 10
business 9.7
metal 9.6
setting 9.6
wine 9.2
celebration 8.8
research 8.6
gramophone 8.4
person 8.4
service 8.3
alcohol 8.3
inside 8.3
occupation 8.2
silver 7.9
indoors 7.9
life 7.8
napkin 7.8
male 7.8
adult 7.8
record player 7.8
waiting 7.7
meal 7.7
lunch 7.7
communication 7.5
fashion 7.5
place 7.4
glasses 7.4
design 7.3
steel 7.1
working 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.4
clothing 89.6
person 86.9
musical instrument 81.7
drum 66.4
man 61.9

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 98.9%
Calm 94.2%
Sad 1.6%
Disgusted 0.9%
Angry 0.9%
Confused 0.8%
Surprised 0.7%
Happy 0.6%
Fear 0.2%
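
The age range, gender, and per-emotion confidences above match the face attributes that Rekognition's DetectFaces operation reports. A minimal sketch, again with a placeholder image reference:

import boto3

rekognition = boto3.client("rekognition")

# Placeholder image reference; not the actual source of the record above.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')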

Feature analysis

Amazon

Person
Person 99.6%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

QUEEN
REMINGTON
CHR
MJI7
MJI7 YESTAD
DISPA
YESTAD
LOURS
DE
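
Fragments like "YESTAD" and "MJI7" are typical of OCR run on reversed or partially occluded signage photographed through a shop window. With Rekognition, text of this kind comes from the DetectText operation; a brief sketch with a placeholder image reference:

import boto3

rekognition = boto3.client("rekognition")

# Placeholder image reference; not the actual source of the record above.
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
)

# LINE detections return whole strings such as "REMINGTON"; WORD detections split them up.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')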

Google

MJI7 YT3RA REMINGTON
MJI7
YT3RA
REMINGTON
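
The Google entries above are the shape of output produced by Cloud Vision text detection. A comparable sketch with the google-cloud-vision client (the file path is a placeholder):

from google.cloud import vision

# Placeholder path; not the actual image file behind this record.
with open("photo.jpg", "rb") as f:
    content = f.read()

client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=content))

# The first annotation is the full detected block; the rest are individual
# tokens such as "MJI7" or "REMINGTON".
for annotation in response.text_annotations:
    print(annotation.description)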