Human Generated Data

Title

Untitled (men sitting at table next to padded wall)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16102.3

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Human 99.2
Person 99.2
Person 98.9
Furniture 98.6
Person 98.3
Person 96.0
Table 94.8
Chair 89.5
Restaurant 86.8
Dining Table 86.4
Indoors 82.3
Building 81.4
Housing 81.4
Pub 80.2
Bar Counter 78.7
Wood 73.6
Person 70.6
Cafe 69.0
Room 66.8
Sitting 62.4
Plywood 56.9
Cafeteria 55.9
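
The Amazon tags above have the shape of AWS Rekognition label-detection output: a label name plus a confidence score. A minimal sketch of how such a list can be produced with the boto3 SDK; the file name "photo.jpg" and the region are illustrative assumptions, not part of this record.

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
# "photo.jpg" and the region are placeholder assumptions.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # the record above lists about two dozen labels
    MinConfidence=55.0,  # the lowest confidence shown above is 55.9
)

for label in response["Labels"]:
    # Prints name/confidence pairs like "Human 99.2", as in the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```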

Imagga
created on 2022-02-11

musical instrument 20.1
people 19.5
man 18.8
person 18.5
silhouette 16.5
music 16.4
dark 15.9
device 14.6
night 14.2
male 13.5
percussion instrument 13.3
stretcher 13.2
chair 12.7
light 12.6
sexy 12
body 12
passion 11.3
hot 10.9
adult 10.7
litter 10.5
attractive 10.5
stage 10.4
black 10.2
model 10.1
sensual 10
conveyance 9.6
water 9.3
life 9
crowd 8.6
performance 8.6
party 8.6
microphone 8.6
men 8.6
equipment 8.5
club 8.5
entertainment 8.3
window 8.2
disco 8.2
sunset 8.1
group 8.1
interior 8
electronic instrument 7.9
room 7.9
work 7.8
performer 7.8
concert 7.8
portrait 7.8
dancing 7.7
pretty 7.7
sky 7.7
dance 7.6
fashion 7.5
human 7.5
fire 7.5
one 7.5
lights 7.4
suit 7.4
inside 7.4
occupation 7.3
building 7.3
business 7.3
protection 7.3
sensuality 7.3
metal 7.2
shadow 7.2
hair 7.1
posing 7.1
love 7.1
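
The Imagga tags follow the same name/confidence pattern. A hedged sketch of fetching them through Imagga's public v2 tagging endpoint with the requests library; the credentials and file name are placeholder assumptions.

```python
# Sketch: image tagging via Imagga's REST API (v2 /tags endpoint).
# API_KEY, API_SECRET, and "photo.jpg" are placeholder assumptions.
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

with open("photo.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
        files={"image": f},
    )

for entry in response.json()["result"]["tags"]:
    # Prints pairs like "musical instrument 20.1", matching the list above.
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```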

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

person 87.7
dark 71.8
text 69.0
table 53.6

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 97.0%
Calm 98.5%
Angry 0.9%
Sad 0.3%
Surprised 0.1%
Happy 0.1%
Confused 0.1%
Fear 0.0%
Disgusted 0.0%

AWS Rekognition

Age 37-45
Gender Male, 99.8%
Angry 51.0%
Calm 31.8%
Confused 6.8%
Sad 3.8%
Disgusted 2.5%
Happy 1.5%
Surprised 1.4%
Fear 1.3%

AWS Rekognition

Age 29-39
Gender Female, 82.4%
Sad 89.7%
Fear 4.1%
Calm 3.4%
Happy 0.9%
Disgusted 0.7%
Angry 0.5%
Surprised 0.3%
Confused 0.3%
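
Each of the three AWS Rekognition face records above (an age range, a gender with confidence, and an emotion distribution summing to roughly 100%) corresponds to one FaceDetails entry from the DetectFaces API. A minimal boto3 sketch; the file name is an assumption.

```python
# Sketch: face analysis with AWS Rekognition DetectFaces.
# Attributes=["ALL"] requests age range, gender, and emotions.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back as a list of {Type, Confidence} entries,
    # e.g. CALM 98.5 -- the same distribution shown above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```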

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
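
The Google Vision face records report likelihoods on a fixed enum scale (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why every row above reads "Very unlikely". A sketch using the google-cloud-vision client library; the file name is an assumption.

```python
# Sketch: face detection with the Google Cloud Vision client library.
# Likelihood fields are enums (VERY_UNLIKELY .. VERY_LIKELY), not scores.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute maps to a row above, e.g. "Surprise Very unlikely".
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```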

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a group of people in a dark room 95.4%
a group of people sitting in a dark room 92.5%
a group of people sitting in chairs in a dark room 91.9%
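
The ranked captions above match the output of Azure's Computer Vision image-description feature, which returns several candidate captions with confidences. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholder assumptions.

```python
# Sketch: image captioning with Azure Computer Vision.
# ENDPOINT, KEY, and "photo.jpg" are placeholder assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "your_key"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    # Prints e.g. "a group of people in a dark room 95.4%"
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```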

Text analysis

Amazon

4
C
MIID

Google

M VT3 2
M
VT3
2
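
These text-analysis entries are raw OCR hits: the fragments "4", "C", "MIID", and "M VT3 2" are simply what the services read off the photograph, with Google returning the full line first and then its individual words. A minimal boto3 sketch of the Rekognition side; Google Vision's text_detection behaves analogously.

```python
# Sketch: text detection (OCR) with AWS Rekognition DetectText.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; fragments like "MIID" above are WORD hits.
    print(detection["Type"], detection["DetectedText"])
```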