Human Generated Data

Title

Untitled (men sitting at table next to padded wall)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16102.1

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.5
Human 99.5
Person 98.9
Person 97.8
Furniture 97.6
Table 95.3
Lighting 93.6
Restaurant 92.6
Dining Table 90.5
Chair 84.8
Bar Counter 79.8
Pub 79.8
Wood 78.7
Indoors 76.4
Interior Design 76.4
Room 75
Plywood 64.6
Food 63.5
Cafe 63.3
Meal 62.5
Food Court 59.6
Dining Room 56.6
Undershirt 56.5
Clothing 56.5
Apparel 56.5
Coat 55.5
Overcoat 55.5

Imagga
created on 2022-02-11

musical instrument 30.4
person 22.3
music 20.9
people 20.6
silhouette 19.9
sax 18.7
wind instrument 16.9
black 16.3
night 16
man 15.4
dark 15
adult 14
stringed instrument 13.8
concert 13.6
light 13.5
performance 13.4
performer 13.1
stage 13
musician 12.8
guitar 12.8
male 12.1
body 12
hot 11.7
dancing 11.6
dance 11.5
club 11.3
sexy 11.2
attractive 11.2
portrait 11
device 10.7
fashion 10.6
percussion instrument 10.5
disco 10.5
rock 10.4
passion 10.3
party 10.3
love 10.3
entertainment 10.1
model 10.1
couple 9.6
fire 9.4
bowed stringed instrument 9.3
art 9.2
brass 9.1
bass 9.1
sensual 9.1
sensuality 9.1
sunset 9
posing 8.9
singer 8.8
nightclub 8.8
crowd 8.6
motion 8.6
dancer 8.4
pretty 8.4
one 8.2
sitting 7.7
outfit 7.7
equipment 7.7
musical 7.7
erotic 7.6
elegance 7.6
human 7.5
sound 7.5
fun 7.5
smoke 7.4
style 7.4
event 7.4
makeup 7.3
chair 7.3
lady 7.3
life 7.3
group 7.3
metal 7.2
cornet 7.2
religion 7.2
shadow 7.2
romance 7.1
hair 7.1
women 7.1

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

person 94.3
indoor 88.5
man 78.5
dark 76.7
table 72.9
clothing 70.6
lit 55.3
light 55.1

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Calm 86.8%
Angry 6.9%
Sad 3.1%
Surprised 0.8%
Confused 0.8%
Fear 0.7%
Happy 0.5%
Disgusted 0.4%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Angry 56.5%
Calm 31.1%
Confused 4.5%
Sad 2.9%
Disgusted 1.9%
Happy 1.2%
Surprised 1%
Fear 0.9%

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 45
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people in a dark room 92.3%
a group of people sitting in a dark room 88.5%
a person sitting in a dark room 87.1%

Text analysis

Amazon

9
KODAK
KODAK EIRN
EIRN

Google

T
A°2
MJI3 Y T 37 A°2 MAGOM
Y
MAGOM
MJI3
37