Human Generated Data

Title

Untitled (men sitting at table next to padded wall)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16102.2

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-02-11

Person 99.1
Human 99.1
Person 99
Furniture 98.7
Person 98.5
Table 96.1
Person 91.4
Person 89.5
Dining Table 88.2
Chair 74.7
Bar Counter 69.3
Pub 69.3
Indoors 63.8
Desk 61.2
Wood 59.9
Restaurant 59.9
Cafe 56.3
Clothing 55.6
Apparel 55.6
Room 55.2
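The Amazon list above pairs each label with a Rekognition-style confidence score. A minimal Python sketch of filtering such output by a confidence threshold, using values transcribed from the list above (the helper name `confident_labels` is illustrative, not part of any API):

```python
# (label, confidence %) pairs transcribed from the Amazon tag list above.
labels = [
    ("Person", 99.1), ("Human", 99.1), ("Person", 99.0),
    ("Furniture", 98.7), ("Table", 96.1), ("Dining Table", 88.2),
    ("Chair", 74.7), ("Pub", 69.3), ("Indoors", 63.8),
    ("Restaurant", 59.9), ("Room", 55.2),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))
# ['Person', 'Human', 'Person', 'Furniture', 'Table']
```

At a 90% cutoff only the person, furniture, and table detections survive; the scene-level guesses (Pub, Restaurant, Room) all fall below it.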

Imagga
created on 2022-02-11

musical instrument 21.7
person 20.7
silhouette 19.9
dark 18.4
people 17.8
night 16
man 15.5
music 14.5
black 13.9
stringed instrument 13.6
light 13.4
percussion instrument 12.7
male 11.3
passion 11.3
hot 10.9
device 10.7
sexy 10.4
body 10.4
adult 10
sunset 9.9
chair 9.8
bowed stringed instrument 9.5
model 9.3
window 9.2
sensual 9.1
equipment 8.9
concert 8.7
building 8.7
love 8.7
stage 8.7
attractive 8.4
fashion 8.3
sky 8.3
sensuality 8.2
laptop 8.1
shadow 8.1
water 8
scholar 8
business 7.9
holiday 7.9
room 7.7
pretty 7.7
fire 7.5
fun 7.5
life 7.5
one 7.5
smoke 7.4
style 7.4
piano 7.4
center 7.4
protection 7.3
group 7.3
metal 7.2
romance 7.1
hair 7.1
portrait 7.1
interior 7.1
work 7.1
architecture 7
indoors 7

Google
created on 2022-02-11

Table 91.7
Automotive design 86.7
Chair 85.6
Wheel 78
Desk 75.9
Event 73
Darkness 66.6
Tire 65.8
Visual arts 65
Room 63.4
Conversation 62.5
Luxury vehicle 60.5
Suit 60.1
Font 59.7
Recreation 58.9
Sitting 57.3
Job 57
Employment 54.8
Airplane 54.3
Aerospace engineering 50.7

Microsoft
created on 2022-02-11

person 95.3
indoor 91.2
man 80.4
clothing 80
text 73.3
dark 60.2

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 98.4%
Calm 74.8%
Angry 15.8%
Sad 6.3%
Fear 0.7%
Surprised 0.7%
Confused 0.6%
Happy 0.5%
Disgusted 0.5%

AWS Rekognition

Age 41-49
Gender Male, 99.7%
Calm 48.5%
Angry 31.5%
Confused 6.6%
Sad 4.5%
Disgusted 2.7%
Happy 2.6%
Surprised 2%
Fear 1.6%
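Each AWS Rekognition face block above reports a full emotion distribution. A minimal sketch of reading off the dominant emotion, with scores transcribed from the second face above (the variable names are illustrative, not Rekognition response fields):

```python
# Emotion scores (in %) for the second detected face, as listed above.
emotions = {
    "Calm": 48.5, "Angry": 31.5, "Confused": 6.6, "Sad": 4.5,
    "Disgusted": 2.7, "Happy": 2.6, "Surprised": 2.0, "Fear": 1.6,
}

# The dominant emotion is the highest-scoring entry.
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])
# Calm 48.5
```

Note the scores sum to roughly 100%, so a low-margin winner (Calm at 48.5% vs. Angry at 31.5% here) signals an ambiguous reading rather than a confident one.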

Microsoft Cognitive Services

Age 48
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people sitting in a dark room 91.1%
a group of people sitting in chairs in a dark room 91%
a person sitting in a dark room 90%

Text analysis

Amazon

5
MAC
i
....
20
20 .... EIRN
EIRN
EIIA

Google

MJI3
MJI3