Human Generated Data

Title

Untitled (boy and girl on chair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17346

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 100
Chair 99.9
Human 99.2
Person 99.2
Shelf 98.8
Bookcase 98
Person 94.6
Indoors 93.2
Room 92.9
Library 87.3
Book 87.3
Couch 77.3
Interior Design 70.3
Living Room 66.7
Reading 58
Armchair 55.6
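The Amazon tags above are label/confidence pairs, where the number is a confidence score out of 100. As an illustration only (plain Python, no live API call; a subset of the pairs is transcribed from the list above), this is how such labels are commonly filtered by a minimum-confidence cutoff before display:

```python
# Label/confidence pairs transcribed from the Amazon tag list above (subset).
labels = [
    ("Furniture", 100.0), ("Chair", 99.9), ("Human", 99.2),
    ("Person", 99.2), ("Shelf", 98.8), ("Bookcase", 98.0),
    ("Couch", 77.3), ("Reading", 58.0), ("Armchair", 55.6),
]

def filter_labels(pairs, min_confidence=80.0):
    """Keep only labels at or above the cutoff, highest confidence first."""
    kept = [(name, conf) for name, conf in pairs if conf >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

high_confidence = filter_labels(labels)
# With the default 80.0 cutoff, Couch, Reading, and Armchair drop out.
```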

Imagga
created on 2022-02-26

man 42.3
grandfather 41.4
male 37
people 32.9
sitting 30.9
home 29.5
adult 29.4
person 29.4
happy 25.7
smiling 24.6
couple 23.5
senior 22.5
working 22.1
indoors 22
men 21.5
together 21
casual 17.8
mature 17.7
talking 17.1
lifestyle 16.6
women 16.6
office 16.5
child 16
passenger 16
business 15.8
elderly 15.3
table 15
computer 14.5
chair 14.4
smile 14.2
family 14.2
laptop 14
room 13.9
work 13.4
businessman 13.2
cheerful 13
worker 12.7
retirement 12.5
day 11.8
discussion 11.7
color 11.7
husband 11.4
reading 11.4
wife 11.4
education 11.3
newspaper 11.1
communication 10.9
horizontal 10.9
discussing 10.8
two people 10.7
couch 10.6
sofa 10.5
old 10.4
looking 10.4
meeting 10.4
portrait 10.3
love 10.3
20s 10.1
relaxing 10
holding 9.9
professional 9.9
team 9.9
living room 9.8
retired 9.7
30s 9.6
father 9.5
adults 9.5
happiness 9.4
friends 9.4
house 9.2
back 9.2
outdoors 9
technology 8.9
group 8.9
job 8.8
40s 8.8
colleagues 8.7
patient 8.6
mother 8.6
loving 8.6
two 8.5
relaxed 8.4
baby 8.3
leisure 8.3
inside 8.3
occupation 8.2
book 8.2
fun 8.2
facing camera 7.9
casual clothing 7.8
desk 7.8
books 7.7
modern 7.7
businesspeople 7.6
enjoying 7.6
kin 7.5
friendship 7.5
help 7.4
teamwork 7.4
camera 7.4
indoor 7.3
teenager 7.3
neonate 7.3
aged 7.2
handsome 7.1
product 7.1
nurse 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 99.6
sitting 98.2
book 97.9
black and white 94
text 90.1
laptop 87.7
furniture 81.4

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 61%
Happy 65.1%
Sad 12.4%
Calm 7.5%
Fear 6.3%
Surprised 3.2%
Disgusted 2.4%
Confused 1.8%
Angry 1.3%

AWS Rekognition

Age 42-50
Gender Female, 99.7%
Calm 82.3%
Happy 7.8%
Surprised 4.2%
Sad 1.8%
Angry 1.1%
Confused 1%
Disgusted 0.9%
Fear 0.9%
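Each AWS Rekognition face block above is a distribution of per-emotion confidence scores over one detected face. A small sketch (using the percentages transcribed from the two blocks above) of reducing each distribution to its dominant emotion:

```python
# Emotion scores for the two detected faces, transcribed from above.
faces = [
    {"Happy": 65.1, "Sad": 12.4, "Calm": 7.5, "Fear": 6.3,
     "Surprised": 3.2, "Disgusted": 2.4, "Confused": 1.8, "Angry": 1.3},
    {"Calm": 82.3, "Happy": 7.8, "Surprised": 4.2, "Sad": 1.8,
     "Angry": 1.1, "Confused": 1.0, "Disgusted": 0.9, "Fear": 0.9},
]

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

top = [dominant_emotion(face) for face in faces]
# top -> [('Happy', 65.1), ('Calm', 82.3)]
```

Note the scores are confidences in the classifier's own terms, not probabilities that the sitter feels that emotion.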

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Couch 77.3%

Captions

Microsoft

a person sitting on a bench reading a book 74.4%
a group of people sitting in front of a laptop 74.3%
a person sitting on a bench in front of a laptop 74.2%
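The Microsoft captions above are alternative descriptions ranked by confidence; typically only the top-scoring one is shown. A minimal sketch of selecting it, using the candidates transcribed above:

```python
# Candidate captions with confidence scores, transcribed from the list above.
captions = [
    ("a person sitting on a bench reading a book", 74.4),
    ("a group of people sitting in front of a laptop", 74.3),
    ("a person sitting on a bench in front of a laptop", 74.2),
]

# Pick the caption with the highest confidence.
best_text, best_conf = max(captions, key=lambda pair: pair[1])
```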

Text analysis

Amazon

1ST