Human Generated Data

Title

Untitled (looking at vacuum cleaner window display)

Date

1948, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.225

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Person 99.5
Person 98.1
Clothing 96.1
Apparel 96.1
Person 95.3
Helmet 94.4
Officer 92.8
Military 92.8
Military Uniform 92.8
Captain 57.5
Wheel 57.1
Machine 57.1
Overcoat 56.1
Coat 56.1

Clarifai
created on 2023-10-25

people 99.9
group together 97.5
adult 96.8
man 96.4
vehicle 95.6
transportation system 93.6
three 92.9
uniform 90.2
veil 89.7
monochrome 88.5
two 87.5
group 87.5
outfit 87.4
gas station 84.8
wear 83
sports equipment 79.6
four 79
lid 76.6
several 76.2
police 75.4

Imagga
created on 2021-12-14

brass 77.1
wind instrument 52.8
bass 47.3
musical instrument 37.8
trombone 34.4
man 22.8
music 22.2
horn 21.7
device 19.8
instrument 17.3
musician 17
male 17
people 16.2
person 15.2
cornet 14.6
sound 13.9
black 13.8
adult 13.6
style 13.3
play 12.9
concert 12.6
musical 12.4
business 11.5
equipment 11.4
studio 11.4
playing 10.9
work 10.2
hand 9.9
gold 9.9
old 9.7
metal 9.6
office 9.6
rock 9.5
instrumentality 9.4
baritone 9.2
modern 9.1
sax 9.1
jazz 8.8
lifestyle 8.7
life 8.6
technology 8.2
looking 8
job 8
businessman 7.9
glass 7.8
band 7.8
record 7.7
party 7.7
professional 7.6
communication 7.5
player 7.5
fashion 7.5
guitar 7.4
entertainment 7.4
light 7.3
occupation 7.3
home 7.2
kitchen 7.1
portrait 7.1
worker 7.1
medicine 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.7
person 98.6
clothing 93.2
man 87.1
musical instrument 83.9
black and white 74.4
drum 69.2
people 61.9
hat 50.3

Color Analysis

Face analysis

AWS Rekognition

Age 32-48
Gender Male, 99.5%
Calm 65.2%
Happy 28.6%
Disgusted 2.3%
Surprised 1.2%
Angry 1.1%
Confused 0.6%
Fear 0.5%
Sad 0.5%

AWS Rekognition

Age 23-35
Gender Male, 99%
Calm 98.6%
Happy 0.7%
Angry 0.4%
Sad 0.2%
Surprised 0.1%
Fear 0.1%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 44-62
Gender Male, 96.5%
Calm 89.4%
Sad 2.8%
Surprised 2.6%
Fear 1.9%
Angry 1.2%
Happy 1.1%
Confused 0.6%
Disgusted 0.4%

AWS Rekognition

Age 36-54
Gender Male, 81.1%
Disgusted 94.1%
Calm 2.1%
Angry 0.9%
Surprised 0.9%
Sad 0.7%
Fear 0.5%
Confused 0.4%
Happy 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Helmet 94.4%
Wheel 57.1%

Categories

Imagga

paintings art 96.6%
people portraits 3.1%

Text analysis

Amazon

REMINGTO
QUEEN

Google

REMINGTO
REMINGTO