Human Generated Data

Title

Untitled (party guests sitting on chairs)

Date

c. 1950

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19585

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 99.9
Chair 99.6
Person 99
Human 99
Person 98.7
Person 95.6
Person 93.8
Hat 92.8
Clothing 92.8
Apparel 92.8
Person 91.2
Restaurant 90.6
Person 87
Sitting 84.9
Cafeteria 80.3
Table 78.2
Indoors 75.3
Room 74.2
Female 70.4
Cafe 69.6
Person 69.1
Crowd 66.7
Chair 65.1
People 64.3
Meal 63.3
Food 63.3
Dining Table 62.9
Person 62.3
Girl 60
Face 59.5
Leisure Activities 57
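
The Amazon tags above have the shape of AWS Rekognition label-detection output. A minimal sketch of that call with boto3 follows; the credentials setup, the file name photo.jpg, and the thresholds are placeholders and assumptions, not the settings used to generate this record.

import boto3

# Assumption: AWS credentials and region are already configured for boto3.
rekognition = boto3.client("rekognition")

# "photo.jpg" is a placeholder path, not the actual museum image.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # assumed cap on returned labels
    MinConfidence=55,    # assumed confidence floor
)

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')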

Clarifai
created on 2023-10-22

people 99.4
monochrome 97.3
group together 97.3
woman 96.8
adult 95.8
man 95
group 94.5
chair 93.7
wear 91.2
furniture 90.1
sitting 89.1
recreation 87.7
indoors 85.4
sit 82.3
music 79
enjoyment 77.7
outfit 76.7
audience 76.3
seat 72.2
actor 72
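
The Clarifai tags are concept predictions of the kind returned by Clarifai's v2 predict endpoint. The sketch below rests on assumptions: it uses the /v2/models/{model-id}/outputs route with the public general-image-recognition model, and the access token and image URL are placeholders.

import requests

# Placeholders: a Clarifai personal access token and a hosted copy of the image.
CLARIFAI_PAT = "YOUR_PERSONAL_ACCESS_TOKEN"
IMAGE_URL = "https://example.com/photo.jpg"

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 confidence value.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')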

Imagga
created on 2022-03-05

musical instrument 62.9
accordion 55.5
keyboard instrument 44.4
wind instrument 43.5
man 35.6
people 33.4
business 27.9
person 27.7
businessman 24.7
office 24.2
computer 23.5
male 22.7
men 21.4
laptop 21.1
adult 19.6
communication 18.5
executive 18.1
meeting 16.9
chair 16.9
professional 16.9
work 16.7
indoors 16.7
bass 16.1
group 16.1
room 15.4
sitting 15.4
corporate 14.6
smiling 14.4
job 14.1
working 14.1
table 13.8
businesswoman 13.6
team 13.4
couple 13.1
employee 12.6
worker 12.6
handsome 12.5
happy 11.9
teacher 10.9
suit 10.8
salon 10.6
success 10.4
businesspeople 10.4
teamwork 10.2
indoor 10
modern 9.8
cheerful 9.7
interior 9.7
technology 9.6
home 9.6
desk 9.4
confident 9.1
student 8.7
lifestyle 8.7
education 8.7
career 8.5
screen 8.4
occupation 8.2
monitor 8.1
looking 8
women 7.9
together 7.9
photographer 7.8
attitude 7.8
casual 7.6
reading 7.6
holding 7.4
inside 7.4
furniture 7.2
smile 7.1
happiness 7
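
The Imagga tags correspond to its /v2/tags REST endpoint, which uses HTTP Basic auth with an API key and secret. A minimal sketch with placeholder credentials and image URL:

import requests

# Placeholders: Imagga API credentials and a hosted copy of the image.
IMAGGA_KEY, IMAGGA_SECRET = "api_key", "api_secret"
IMAGE_URL = "https://example.com/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
)
resp.raise_for_status()

# Tags arrive as {"confidence": 62.9, "tag": {"en": "musical instrument"}}.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')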

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

Color analysis

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Sad 62%
Happy 17.7%
Calm 9.7%
Angry 2.7%
Fear 2.2%
Surprised 2.1%
Confused 2%
Disgusted 1.6%

AWS Rekognition

Age 41-49
Gender Male, 95.2%
Happy 59.9%
Calm 32.1%
Surprised 5.2%
Disgusted 0.7%
Sad 0.7%
Angry 0.6%
Confused 0.4%
Fear 0.3%

AWS Rekognition

Age 38-46
Gender Male, 87.6%
Happy 96.3%
Calm 1.6%
Sad 0.9%
Surprised 0.4%
Fear 0.3%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%
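
The age ranges, gender estimates, and emotion scores above match the output of AWS Rekognition's detect_faces call with full attributes requested. A minimal sketch, with boto3 credentials and the file name photo.jpg as placeholders:

import boto3

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] asks for age range, gender, emotions, and more.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')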

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
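
The Google Vision rows report likelihood ratings (e.g. "Very unlikely") from its face-detection annotator. A sketch using the google-cloud-vision Python client, with credentials and the file name photo.jpg as placeholders:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each likelihood is an enum such as VERY_UNLIKELY, matching the rows above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)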

Feature analysis

Amazon

Chair 99.6%
Chair 65.1%
Person 99%
Person 98.7%
Person 95.6%
Person 93.8%
Person 91.2%
Person 87%
Person 69.1%
Person 62.3%
Hat 92.8%

Text analysis

Amazon

T
SUBERGAN
DUSCO SUBERGAN
DUSCO
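
The strings above are OCR detections of the kind returned by AWS Rekognition's detect_text call, which reports both full lines and individual words. A minimal sketch, with boto3 credentials and the file name photo.jpg as placeholders:

import boto3

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is a LINE or a WORD with a confidence score.
for detection in response["TextDetections"]:
    print(f'{detection["DetectedText"]} ({detection["Type"]}, {detection["Confidence"]:.1f}%)')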