Human Generated Data

Title

Untitled (three women sitting at table)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16199

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Restaurant 99.9
Tablecloth 99.5
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Person 98
Person 97.9
Food 96.9
Food Court 96.9
Face 91.2
Head 91.2
Meal 90.8
Home Decor 88.6
Linen 88.6
Clothing 83.5
Formal Wear 83.5
Suit 83.5
Cafeteria 81.6
Sitting 71.9
Chair 63.3
Glass 56.9
Cutlery 56.4
Dinner 56.2
Lunch 55.9
Dish 55.9
Diner 55.8
Beverage 55.6
Shirt 55.3
Alcohol 55.3

Clarifai
created on 2018-08-23

people 99.1
man 96.7
adult 95
chair 94.2
table 93.4
woman 92.9
group together 92.5
furniture 92
group 91.8
restaurant 90.7
room 86.5
actor 85.5
several 85.2
sit 83.3
leader 81.3
portrait 81.2
monochrome 80.4
military 80.3
indoors 79.8
war 78.2

Imagga
created on 2018-08-23

office 54.8
businessman 50.3
business 49.2
computer 46.4
laptop 45.4
corporate 41.2
man 40.3
desk 33.5
male 33.3
professional 32.4
executive 32.4
businesswoman 31.8
people 31.2
meeting 31.1
adult 30.3
work 29.8
working 29.2
person 29.1
sitting 28.3
businesspeople 27.5
table 26.3
job 25.6
group 25
team 24.2
looking 24
handsome 23.2
communication 22.7
happy 21.9
room 21.7
worker 21.3
manager 20.5
teamwork 20.4
suit 19.7
smiling 19.5
center 19.1
career 18.9
men 18.9
jacket 18.5
confident 18.2
attractive 18.2
women 18.2
happiness 18
together 17.5
indoor 17.3
casual 16.9
modern 16.8
indoors 16.7
corporation 16.4
employee 16.3
success 16.1
lifestyle 15.9
company 15.8
notebook 15.7
smile 15.7
discussion 15.6
partners 14.6
colleagues 14.6
conference 13.7
home 12.8
consultant 12.7
technology 12.6
couch 12.6
boss 12.4
talking 12.4
tie 12.3
portrait 12.3
couple 12.2
cheerful 12.2
face 12.1
document 12.1
glasses 12
successful 11.9
partnership 11.5
expression 11.1
horizontal 10.9
cooperation 10.6
color 10.6
formal 10.5
television 10.3
keyboard 10.3
relaxed 10.3
paper 10.2
two 10.2
30s 9.6
females 9.5
camera 9.3
coffee 9.3
friendly 9.1
hand 9.1
one 9
debate 8.9
associates 8.8
coworkers 8.8
monitor 8.8
25-30 years 8.8
conversation 8.7
busy 8.7
workplace 8.6
electronic equipment 8.6
20s 8.2
sofa 8.1
collaboration 7.9
education 7.8
call 7.8
leader 7.7
pretty 7.7
sales 7.7
leadership 7.7
serious 7.6
finance 7.6
smart 7.5
shirt 7.5
presentation 7.4
holding 7.4
occupation 7.3
alone 7.3
interior 7.1

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

indoor 96.5
person 96.1
sitting 95.1
people 74.1
group 57.7
dining table 6.1

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 100%
Happy 75.5%
Sad 10%
Surprised 7.3%
Fear 6.5%
Calm 6.4%
Angry 1.4%
Disgusted 1.1%
Confused 0.7%

AWS Rekognition

Age 37-45
Gender Female, 99.9%
Calm 75.5%
Happy 9.7%
Surprised 8.2%
Fear 6.6%
Angry 3.4%
Confused 2.8%
Sad 2.8%
Disgusted 1.4%

AWS Rekognition

Age 18-26
Gender Female, 99.8%
Happy 96.2%
Surprised 6.6%
Fear 6.1%
Sad 2.3%
Disgusted 0.8%
Calm 0.8%
Angry 0.3%
Confused 0.1%

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 34
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.1%
Male 98.1%
Man 98.1%
Person 98.1%
Suit 83.5%
Chair 63.3%
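The machine-generated tags above pair each label with a confidence score from 0 to 100. A minimal sketch, assuming one wants to keep only high-confidence labels, of filtering such a tag list by a threshold; the tag names and values are copied from the Amazon feature-analysis list, and the function name is hypothetical:

```python
# Labels and confidence scores copied from the Amazon feature-analysis list.
amazon_tags = {
    "Adult": 98.1,
    "Male": 98.1,
    "Man": 98.1,
    "Person": 98.1,
    "Suit": 83.5,
    "Chair": 63.3,
}

def filter_tags(tags, threshold):
    """Keep only labels whose confidence meets or exceeds the threshold."""
    return {label: score for label, score in tags.items() if score >= threshold}

high_confidence = filter_tags(amazon_tags, 90.0)
# → {'Adult': 98.1, 'Male': 98.1, 'Man': 98.1, 'Person': 98.1}
```

At a 90% threshold only the person-related labels survive; lowering it to 80% would also admit "Suit" (83.5).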