Human Generated Data

Title

Untitled (people eating at party)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17203

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.7
Human 98.7
Chair 98.4
Furniture 98.4
Restaurant 98.3
Person 97.7
Person 97.4
Person 96.7
Person 94.6
Sitting 92.5
Person 91
Person 89.5
Cafeteria 83.7
Person 81.1
Meal 80.8
Food 80.8
Cafe 79.7
Person 72.3
Chair 71.9
People 67.9
Dish 67.2
Crowd 64.8
Senior Citizen 60.9
Indoors 60
Photography 59.4
Photo 59.4
Couch 58.6
Suit 58.5
Clothing 58.5
Coat 58.5
Overcoat 58.5
Apparel 58.5
Portrait 58.4
Face 58.4
Food Court 56.2
Person 55.3
Person 45
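
The label/confidence pairs listed above are the kind of output returned by Amazon Rekognition's DetectLabels API. A minimal sketch of how such tags could be produced with boto3 (the file name and the MaxLabels/MinConfidence settings are assumptions, not taken from this record):

import boto3

# Assumes AWS credentials are already configured in the environment.
client = boto3.client("rekognition")

# Assumed local path to the digitized photograph.
with open("untitled_people_eating_at_party.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,         # assumption: cap on the number of labels returned
    MinConfidence=40.0,   # assumption: roughly matches the ~45 floor seen above
)

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")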

Clarifai
created on 2023-10-29

people 99.5
furniture 97.4
group 97
chair 96.5
education 96.4
many 96.3
room 95.9
man 95.8
indoors 95.5
classroom 95.3
school 95.3
adult 94.5
group together 93.9
sit 92.8
woman 92.5
table 90.9
teacher 89.6
seat 89.2
monochrome 89.1
leader 84.2

Imagga
created on 2022-02-26

table 32.1
person 31.8
man 31.6
room 31.1
people 30.1
male 27.7
restaurant 26.1
meeting 24.5
adult 24.2
indoors 23.7
office 23.1
executive 22.6
businessman 22.1
teacher 21.6
home 21.5
classroom 21.2
entrepreneur 21.1
business 20.6
senior 20.6
men 20.6
interior 20.3
chair 20.1
group 19.3
couple 19.2
professional 18.8
together 18.4
modern 18.2
sitting 18
indoor 17.3
work 17.3
computer 16.1
smiling 15.2
communication 15.1
happy 15
salon 14.9
mature 14.9
cafeteria 14.5
women 14.2
team 13.4
desk 13.2
lifestyle 13
corporate 12.9
businesswoman 12.7
drink 12.5
talking 12.4
hall 12.3
laptop 12.1
worker 12
inside 12
conference 11.7
student 11.5
businesspeople 11.4
education 11.3
looking 11.2
teamwork 11.1
suit 10.8
dinner 10.6
job 10.6
educator 10.6
elderly 10.5
building 10.5
portrait 10.3
lunch 10
smile 10
handsome 9.8
colleagues 9.7
design 9.6
happiness 9.4
glass 9.3
presentation 9.3
holding 9.1
class 8.7
party 8.6
clothing 8.5
two 8.5
learning 8.4
study 8.4
manager 8.4
occupation 8.2
food 8
furniture 7.8
teaching 7.8
40s 7.8
discussion 7.8
leader 7.7
retirement 7.7
old 7.7
speaker 7.6
dining 7.6
hand 7.6
enjoying 7.6
floor 7.4
technology 7.4
wine 7.4
board 7.4
cheerful 7.3
success 7.2
life 7.2
kitchen 7.2
love 7.1
working 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

table 95.8
text 95.4
person 87.4
vase 77.5
tableware 72.9
furniture 71.7
christmas tree 71.2
house 70.5
wine glass 61.5
people 58.8
chair 56.9
clothing 56.7
dining table 6.6

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 98%
Sad 66.4%
Calm 27.8%
Angry 2.2%
Happy 1.3%
Surprised 0.7%
Disgusted 0.6%
Fear 0.6%
Confused 0.4%

AWS Rekognition

Age 26-36
Gender Female, 93.5%
Calm 99.2%
Sad 0.5%
Surprised 0.1%
Angry 0.1%
Confused 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 16-24
Gender Male, 87.4%
Sad 87.1%
Calm 8.8%
Happy 1.9%
Confused 0.9%
Angry 0.5%
Disgusted 0.4%
Surprised 0.4%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 93%
Calm 99.6%
Sad 0.2%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Surprised 0%
Happy 0%
Fear 0%
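
The age ranges, gender calls, and emotion percentages in the four blocks above correspond to the AgeRange, Gender, and Emotions fields that Rekognition's DetectFaces returns when all facial attributes are requested. A minimal sketch, with the image loading and file name assumed as in the labeling example:

import boto3

client = boto3.client("rekognition")

with open("untitled_people_eating_at_party.jpg", "rb") as f:  # assumed path
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort highest-first to match the layout above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")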

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
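
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages; these map to the Likelihood enum on each face annotation. A minimal sketch using the google-cloud-vision client, with credentials and the file name assumed:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("untitled_people_eating_at_party.jpg", "rb") as f:  # assumed path
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute below is a Likelihood enum value such as VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)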

Feature analysis

Amazon

Person
Chair
Person 98.7%
Person 97.7%
Person 97.4%
Person 96.7%
Person 94.6%
Person 91%
Person 89.5%
Person 81.1%
Person 72.3%
Person 55.3%
Person 45%
Chair 98.4%
Chair 71.9%
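
The per-object Person and Chair percentages above come from the same DetectLabels response as the tag list: labels for countable objects carry an Instances list with one bounding box and confidence per detection. A short continuation of the earlier sketch, reusing its response variable:

# Continues the detect_labels sketch above; assumes `response` is still in scope.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        # Each instance has a relative BoundingBox and its own Confidence.
        print(f"{label['Name']} {instance['Confidence']:.1f}%")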

Categories

Text analysis

Amazon

KODAK-SEL
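
The single string above is the text Rekognition found in the image. A minimal sketch of the corresponding DetectText call, with setup and file name assumed as in the earlier examples:

import boto3

client = boto3.client("rekognition")

with open("untitled_people_eating_at_party.jpg", "rb") as f:  # assumed path
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE-level detections roughly correspond to the single entry shown above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])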