Human Generated Data

Title

Untitled (two women sitting at table)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20249

Machine Generated Data

Tags (label, confidence score 0-100)

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 99.6
Furniture 99.1
Table 98.3
Pottery 94.1
Dish 92.5
Meal 92.5
Food 92.5
Saucer 88.6
Restaurant 83.4
Chair 70.5
Person 70.1
Cup 69.6
Coffee Cup 66.8
Indoors 63.8
Dining Room 59.9
Room 59.9
Porcelain 58.3
Art 58.3
Photography 57.4
Photo 57.4
Dining Table 57.1
Vase 55.4
Jar 55.4
Bowl 55.2
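
The label/confidence pairs above have the shape of an Amazon Rekognition DetectLabels response. A minimal sketch of how such tags could be generated with boto3 (the filename and the MinConfidence cutoff are hypothetical; the actual pipeline behind these tags is not documented here):

    import boto3

    client = boto3.client("rekognition")

    # Placeholder filename; any JPEG/PNG bytes work the same way.
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff; the lowest score above is ~55
        )

    # Print "Label Confidence" pairs in the same style as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")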

Clarifai
created on 2023-10-22

people 99.6
adult 97.9
man 97
woman 96.1
two 95.4
group 94.8
monochrome 93.9
portrait 92.5
indoors 92
sit 91.4
table 90.9
three 89.9
restaurant 88.6
room 84.8
group together 84.7
hotel 84.6
furniture 84.4
four 82.1
wear 78
drink 77.4

Imagga
created on 2022-03-05

lab coat 50.3
coat 45.9
man 42.3
person 36.4
male 31.2
people 30.7
adult 24.3
medical 23.8
work 23.5
worker 22.5
working 22.1
professional 21.8
waiter 20.6
team 20.6
office 19.9
table 19.9
patient 19.7
doctor 19.7
happy 19.4
sitting 18.9
men 18.9
garment 18.6
business 18.2
smiling 18.1
businessman 17.6
indoors 17.6
home 17.5
clothing 16.8
30s 16.4
meeting 16
hospital 15.3
nurse 15.2
job 15
women 15
senior 15
room 14.1
employee 13.9
restaurant 13.8
scientist 13.7
businesswoman 13.6
talking 13.3
desk 13.2
together 13.1
couple 13.1
20s 12.8
dining-room attendant 12.6
health 12.5
medicine 12.3
food 12.2
teamwork 12
computer 12
portrait 11.6
kitchen 11.6
lifestyle 11.6
smile 11.4
businesspeople 11.4
clinic 11.3
group 11.3
looking 11.2
mature 11.2
two 11
occupation 11
indoor 10.9
holding 10.7
lab 10.7
colleagues 10.7
dinner 10.7
laboratory 10.6
cheerful 10.6
day 10.2
coffee 10.2
camera 10.2
drink 10
40s 9.7
to 9.7
assistant 9.7
standing 9.6
research 9.5
adults 9.5
casual 9.3
bright 9.3
hand 9.1
meal 9.1
suit 9
science 8.9
half length 8.8
mid adult 8.7
happiness 8.6
biology 8.5
face 8.5
student 8.4
horizontal 8.4
care 8.2
laptop 8.2
interior 8
four people 7.9
coworkers 7.9
chemical 7.8
education 7.8
color 7.8
middle aged 7.8
chemistry 7.7
daytime 7.7
modern 7.7
profession 7.7
executive 7.6
study 7.5
wine 7.4
emotion 7.4
microscope 7.4

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.8
person 96.8
tableware 96.3
black and white 95.7
indoor 92.6
clothing 79.9
table 78.6
man 73.8
monochrome 71.8
human face 64.3
bottle 63.8
old 51.9

Face analysis

AWS Rekognition

Age 49-57
Gender Female, 55%
Surprised 72.7%
Happy 25.2%
Fear 1.4%
Calm 0.2%
Disgusted 0.1%
Angry 0.1%
Sad 0.1%
Confused 0.1%
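
The age range, gender confidence, and ranked emotion percentages above match the FaceDetails structure returned by Rekognition's DetectFaces when all attributes are requested. A hedged sketch with boto3 (filename hypothetical):

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder filename
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        # Emotions arrive unsorted; rank them as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")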

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
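
The five-step buckets above, from "Very unlikely" to "Very likely", are Google Cloud Vision's Likelihood enum, with one block per detected face. A sketch using the google-cloud-vision client (filename hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # placeholder filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:  # one iteration per face found
        for name, value in [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]:
            # Likelihood.name is e.g. "VERY_UNLIKELY" -> "Very unlikely"
            print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())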

Feature analysis

Amazon

Person 99.7%
Person 99.6%
Person 70.1%
Dining Table 57.1%

Text analysis

Amazon

are

Google

YT33A2- NAGO
YT33A2-
NAGO
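
The Google output above follows the usual shape of a Vision text_detection response: the first annotation is the full detected string ("YT33A2- NAGO") and the remaining entries are its word-level pieces. A sketch (filename hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # placeholder filename
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # First entry: whole string; following entries: individual tokens.
    for text in response.text_annotations:
        print(text.description)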