Human Generated Data

Title

Untitled (two women having tea)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20098

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.7
Person 99.7
Person 99.5
Sitting 97.8
Furniture 89.7
Table 87.4
Restaurant 85.6
Tablecloth 82.1
Apparel 80.8
Clothing 80.8
Home Decor 80.2
Meal 79.9
Food 79.9
Dining Table 77.2
Tabletop 74.6
Suit 73.4
Coat 73.4
Overcoat 73.4
Dish 70.5
Female 60.6
Cafeteria 58.3
Chair 57.9
Face 57.6
Glass 56.0

Imagga
created on 2022-03-05

man 43.7
person 40.2
computer 36.2
office 36
laptop 35.7
male 35.5
people 35.2
adult 34
business 32.8
professional 32.1
businessman 31.8
lab coat 30.4
executive 29.5
desk 26.7
coat 26.6
sitting 25.8
worker 24.6
happy 24.5
businesswoman 23.6
work 23.6
table 23.5
smiling 21.7
corporate 21.5
indoors 21.1
businesspeople 20.9
working 20.3
home 20
teacher 19.9
group 19.4
room 19.1
meeting 18.9
job 18.6
indoor 18.3
men 18
portrait 17.5
suit 17.1
team 16.1
education 15.6
colleagues 15.6
looking 15.2
senior 15
casual 14.4
classroom 14.3
smile 14.3
face 14.2
modern 14
together 14
mature 14
teamwork 13.9
clothing 13.7
handsome 13.4
technology 13.4
waiter 13.3
couple 13.1
nurse 12.9
educator 12.4
employee 12.4
patient 12.2
successful 11.9
coworkers 11.8
happiness 11.8
conference 11.7
lifestyle 11.6
holding 11.6
specialist 11.5
talking 11.4
cheerful 11.4
attractive 11.2
manager 11.2
women 11.1
garment 11
occupation 11
student 10.9
horizontal 10.9
associates 10.8
businessperson 10.7
corporation 10.6
30s 10.6
workplace 10.5
bright 10
40s 9.7
medical 9.7
success 9.7
mid adult 9.7
color 9.5
doctor 9.4
camera 9.2
20s 9.2
dining-room attendant 9.1
confident 9.1
notebook 9
document 8.8
concentration 8.7
paper 8.6
elderly 8.6
serious 8.6
expression 8.5
adults 8.5
two 8.5
clothes 8.4
communication 8.4
focus 8.3
hospital 8.1
school 8.1
day 7.9
businessmen 7.8
using 7.7
retirement 7.7
old 7.7
profession 7.7
tie 7.6
plan 7.6
keyboard 7.5
company 7.4
glasses 7.4
inside 7.4
alone 7.3
clinic 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

black and white 92.6
indoor 90.7
text 90
table 90
person 86.3
window 83.8
clothing 83.7
man 68.9
human face 59.1

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 95.7%
Calm 95.6%
Happy 2.5%
Sad 1%
Disgusted 0.4%
Fear 0.2%
Confused 0.1%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 41-49
Gender Female, 99.7%
Happy 91.7%
Calm 7.5%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Sad 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a person sitting at a table in front of a window 83.3%
a person sitting at a desk in front of a window 83.2%
a man and a woman sitting at a table in front of a window 71.3%

Text analysis

Amazon

YТЭА-AОX

Google

I YT37A2 AO
AO
I
YT37A2