Human Generated Data

Title

Untitled (three children reading magazines)

Date

c. 1960

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17235

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 99.5
Human 98.4
Person 98.4
Table 98.1
Person 97.3
Chair 96.2
Person 96.1
Person 94.6
Sitting 90.1
Indoors 89.5
Dining Table 88.9
Room 87
Living Room 84.9
Clothing 79.8
Apparel 79.8
Couch 76.7
Plant 76.1
Coffee Table 69.7
Face 69
Text 68.4
Shorts 65.7
Female 63.8
Child 62.5
Kid 62.5
Photography 60.9
Photo 60.9
Desk 60.2
Girl 57.6
Dining Room 56.3
Floor 55.7
Advertisement 55.4
Poster 55.4
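The Amazon tags above are label–confidence pairs, as returned by automatic image-labeling services. A minimal sketch of filtering such pairs by a confidence threshold (sample values are taken from the list above; the threshold of 90 is an arbitrary choice for illustration):

```python
# Label-confidence pairs, sampled from the Amazon tag list above.
labels = {
    "Furniture": 99.5,
    "Human": 98.4,
    "Table": 98.1,
    "Chair": 96.2,
    "Sitting": 90.1,
    "Indoors": 89.5,
    "Living Room": 84.9,
    "Child": 62.5,
}

def high_confidence(tags, threshold=90.0):
    """Return tag names whose confidence meets the threshold, best first."""
    return [name for name, score in
            sorted(tags.items(), key=lambda kv: kv[1], reverse=True)
            if score >= threshold]

print(high_confidence(labels))
# → ['Furniture', 'Human', 'Table', 'Chair', 'Sitting']
```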

Imagga
created on 2022-02-26

laptop 45.3
man 42.4
home 39.9
person 36.9
male 36.2
senior 33.8
room 33.7
people 32.4
computer 32.2
classroom 31
sitting 30.9
couple 30.5
adult 30
teacher 28.5
smiling 27.5
happy 26.9
indoors 26.4
together 26.3
table 26.1
elderly 23.9
retired 21.3
office 21.2
mature 20.5
retirement 20.2
technology 20
working 19.4
old 18.8
lifestyle 18.8
women 18.2
professional 18.2
meeting 17.9
grandfather 17.8
group 17.7
portrait 17.5
men 17.2
work 16.6
business 16.4
educator 16.3
team 16.1
family 16
smile 15.7
husband 15.3
businessman 15
teamwork 14.8
businesswoman 14.5
looking 14.4
desk 14.2
newspaper 14.1
school 13.5
wife 13.3
relaxed 13.1
education 13
indoor 12.8
casual 12.7
discussion 12.7
modern 12.6
cheerful 12.2
house 11.7
older 11.7
reading 11.4
talking 11.4
happiness 11
student 10.8
notebook 10.7
using 10.6
friends 10.3
two 10.2
horizontal 10.1
chair 9.9
product 9.9
living room 9.8
book 9.8
lady 9.7
job 9.7
interior 9.7
colleagues 9.7
couch 9.7
success 9.7
boy 9.6
togetherness 9.4
executive 9.4
learning 9.4
grandma 9.4
communication 9.2
60s 8.8
class 8.7
child 8.7
corporate 8.6
businesspeople 8.5
living 8.5
face 8.5
clothing 8.3
leisure 8.3
scholar 8.3
children 8.2
relaxing 8.2
aged 8.1
kid 8
to 8
love 7.9
browsing 7.9
students 7.8
color 7.8
two people 7.8
creation 7.7
studying 7.7
sofa 7.7
wireless 7.6
presentation 7.4
glasses 7.4
pensioner 7.3
successful 7.3
suit 7.2
handsome 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99
table 91.4
person 90.8
indoor 88.4
clothing 83.9
computer 76.2
furniture 73.3
man 71.6
drawing 68.4
library 64.7
human face 60.3
laptop 55.5
black and white 52
desk 7

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 98.2%
Surprised 87.2%
Happy 4.2%
Calm 3.6%
Sad 2.6%
Fear 0.7%
Disgusted 0.7%
Angry 0.6%
Confused 0.4%

AWS Rekognition

Age 20-28
Gender Male, 56%
Calm 38.4%
Sad 35.2%
Angry 16.1%
Disgusted 3.1%
Confused 2.3%
Fear 2.1%
Happy 1.8%
Surprised 1%
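Each AWS Rekognition face record above pairs the detected emotions with confidence scores that sum to roughly 100%, listed from most to least likely. A minimal sketch of selecting the dominant emotion from such a record (values taken from the first face above):

```python
# Emotion scores for the first detected face, from the record above.
emotions = {
    "Surprised": 87.2,
    "Happy": 4.2,
    "Calm": 3.6,
    "Sad": 2.6,
    "Fear": 0.7,
    "Disgusted": 0.7,
    "Angry": 0.6,
    "Confused": 0.4,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

name, score = dominant_emotion(emotions)
print(f"{name}: {score}%")
# → Surprised: 87.2%
```

Note that the second face above has no single strong signal (Calm 38.4% vs. Sad 35.2%), so a dominant-emotion readout is less reliable there.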

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%

Captions

Microsoft

a group of people sitting at a table 83.3%
a group of people sitting at a table in front of a laptop 77.1%
a group of people sitting at a table with a laptop 77%

Text analysis

Amazon

2
PRESLEY
TENDEN
EURO
000

Google

2
2