Human Generated Data

Title

Untitled (men at banquet table holding oversized pretzel)

Date

1950

People

Artist: Jean Raeburn, American, active 1950s

Artist: Lester Cole, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19782

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.2
Human 99.2
Person 98.8
Person 98.8
Person 98.8
Person 95.6
Person 95.1
Tie 94.8
Accessories 94.8
Accessory 94.8
Clothing 92.2
Apparel 92.2
Suit 85.4
Overcoat 85.4
Coat 85.4
Sitting 77.5
Sleeve 72.2
Meal 69.4
Food 69.4
Crowd 67.9
Cafeteria 67.5
Restaurant 67.5
Long Sleeve 66
Shirt 60.4
People 60.2
Finger 58.7
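
The Amazon tags above are name/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch using boto3 follows; the region, bucket, object key, and thresholds are placeholder assumptions, not values taken from this record.

```python
import boto3

# Rekognition client; the region is an assumed placeholder.
client = boto3.client("rekognition", region_name="us-east-1")

# DetectLabels on an image stored in S3 (bucket and key are hypothetical).
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "banquet-pretzel.jpg"}},
    MaxLabels=30,
    MinConfidence=55.0,
)

# Print name/confidence pairs, which is the shape of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```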

Clarifai
created on 2023-10-22

people 99.9
man 98.5
adult 97.9
group 93.5
group together 93
actor 91.8
music 91
administration 90.4
monochrome 89.9
woman 88.3
portrait 88.1
one 86.9
furniture 86.1
room 83.8
wear 82.9
grinder 81.4
leader 80.3
two 80
military 79.9
chair 77.8
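
The Clarifai concepts above (a name plus a 0-1 score, shown here as a percentage) can be requested from Clarifai's v2 predict REST endpoint. A rough sketch with requests; the access token, model ID, and image URL are placeholders, and account/app scoping details may differ from this minimal form.

```python
import requests

# All identifiers below are assumptions for illustration, not values
# from this record.
PAT = "YOUR_CLARIFAI_PAT"
MODEL_ID = "general-image-recognition"
IMAGE_URL = "https://example.org/banquet-pretzel.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

# Concepts come back as name/value pairs; value is a 0-1 score.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```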

Imagga
created on 2022-03-05

device 63.1
ventilator 38
electric fan 32.6
fan 26.8
machine 19.5
man 18.8
musical instrument 17.7
brass 16.5
equipment 13.5
percussion instrument 12.9
metal 12.9
gong 12.6
worker 12.4
person 12.3
male 12
old 11.8
wind instrument 11.8
black 11.4
travel 11.3
technology 11.1
industry 11.1
business 10.9
iron lung 10.7
work 10.7
light 10
industrial 10
steel 9.9
mask 9.7
computerized axial tomography scanner 9.6
gramophone 9.5
power 9.2
adult 9.1
human 9
people 8.9
gear 8.8
men 8.6
respirator 8.6
engineering 8.6
art 8.5
job 8
businessman 7.9
wheel 7.6
record player 7.6
iron 7.5
object 7.3
history 7.1
idea 7.1
science 7.1
architecture 7
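
The Imagga tags above are confidence/tag pairs of the kind returned by Imagga's /v2/tags endpoint. A hedged sketch; the API key, secret, and image URL are placeholders.

```python
import requests

# Credentials and the image URL are placeholder assumptions.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/banquet-pretzel.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

# Each tag carries a 0-100 confidence and a language-keyed name.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```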

Google
created on 2022-03-05

Table 90.3
Style 83.8
Black-and-white 83.5
Suit 80.2
Monochrome photography 74.7
Event 73.9
Monochrome 73.6
Tableware 72.6
Serveware 64.2
Room 64
Stock photography 63.6
Hat 63.1
History 62.6
Circle 61.4
Vintage clothing 60.6
Plate 57.2
Art 56.6
Curtain 56.3
Sitting 55.8
Chair 54.4
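
The Google tags above correspond to label detection in the Cloud Vision API. A minimal sketch with the google-cloud-vision client; credentials are assumed to be configured in the environment, and the image URI is a placeholder.

```python
from google.cloud import vision

# Assumes Google Cloud credentials are available in the environment.
client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/banquet-pretzel.jpg"

# Label detection returns description/score pairs like the list above
# (scores are 0-1, displayed here as percentages).
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```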

Microsoft
created on 2022-03-05

window 93.5
person 90.8
indoor 88.1
black and white 74.9
people 62.7
table 60
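
The Microsoft tags above match the Tags feature of Azure's Analyze Image API. A sketch against the documented REST route; the endpoint, key, and image URL are placeholders.

```python
import requests

# Endpoint, key, and image URL are placeholder assumptions.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/banquet-pretzel.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)

# Tags are name/confidence pairs with confidence in 0-1.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```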

Color Analysis

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 71.7%
Sad 89%
Surprised 4.7%
Calm 3.3%
Confused 1.6%
Disgusted 0.5%
Fear 0.4%
Happy 0.3%
Angry 0.2%

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 84.7%
Confused 6.1%
Surprised 2.9%
Angry 2.2%
Sad 1.4%
Disgusted 1.4%
Happy 1%
Fear 0.3%

AWS Rekognition

Age 48-54
Gender Male, 100%
Happy 70.6%
Calm 19.1%
Sad 4.6%
Surprised 2.1%
Confused 1.8%
Disgusted 0.9%
Angry 0.8%
Fear 0.2%

AWS Rekognition

Age 39-47
Gender Male, 82.3%
Confused 48.3%
Happy 22.7%
Sad 14.2%
Calm 6.9%
Fear 4.2%
Surprised 1.6%
Disgusted 1.5%
Angry 0.5%

AWS Rekognition

Age 51-59
Gender Male, 97.1%
Sad 97.8%
Calm 1.2%
Confused 0.7%
Surprised 0.1%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Male, 91.9%
Calm 59.3%
Sad 12.2%
Angry 8.5%
Surprised 6.2%
Disgusted 5.3%
Happy 3.3%
Fear 3.3%
Confused 2%
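
Each AWS Rekognition block above (age range, gender, emotion percentages) has the shape of a FaceDetail returned by DetectFaces with all attributes requested. A minimal boto3 sketch; the bucket and key are hypothetical.

```python
import boto3

# DetectFaces with Attributes=["ALL"] returns age range, gender,
# and per-emotion confidences for each detected face.
client = boto3.client("rekognition", region_name="us-east-1")
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "banquet-pretzel.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```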

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
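
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages, as returned by face detection in the Cloud Vision API. A minimal sketch; the image URI is a placeholder.

```python
from google.cloud import vision

# Face detection returns likelihood enums for each attribute below.
client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/banquet-pretzel.jpg"

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```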

Feature analysis

Amazon

Person 99.2%
Person 98.8%
Person 98.8%
Person 98.8%
Person 95.6%
Person 95.1%
Tie 94.8%
Suit 85.4%

Text analysis

Amazon

TO
MJIR
MJIR YT3RAS ACHMA
YT3RAS
ACHMA
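
The Amazon text fragments above are line- and word-level detections of the kind returned by Rekognition's DetectText. A minimal boto3 sketch; the bucket and key are hypothetical.

```python
import boto3

# DetectText returns both LINE and WORD detections with confidences.
client = boto3.client("rekognition", region_name="us-east-1")
response = client.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "banquet-pretzel.jpg"}},
)

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))
```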

Google

MJI7 YT3R
MJI7
YT3R
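
The Google text fragments come from Cloud Vision text detection, where the first annotation is the full detected string and the rest are individual words. A minimal sketch; the image URI is a placeholder.

```python
from google.cloud import vision

# Text detection returns the full string first, then word-level entries.
client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/banquet-pretzel.jpg"

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```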