Human Generated Data

Title

Untitled (men looking at drawings on drafting table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16545

Machine Generated Data

Tags

Amazon
created on 2022-02-12

Human 99.2
Person 99.2
Person 98.6
Nature 77.9
Photography 62.6
Photo 62.6
Person 62.1
Outdoors 61.1
Smoke 59
Leisure Activities 58.9
Crowd 58.8
Clinic 55.9
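
These label scores are consistent with output from Amazon Rekognition's label-detection service. A minimal sketch of how such tags might be produced, assuming the boto3 SDK, configured AWS credentials, and a placeholder file name for the digitized photograph (none of these are taken from the record itself):

import boto3

rekognition = boto3.client("rekognition")

# "photograph.jpg" is a placeholder for the digitized image, not the actual museum asset.
with open("photograph.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=55,  # keep only labels scored at roughly 55% or higher, as in the list above
    )

# Print each label name with its confidence score, matching the "tag score" layout used here.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')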

Imagga
created on 2022-02-12

person 37.2
man 34.9
male 31.2
specialist 29.3
people 29
adult 27.4
professional 21.2
medical 20.3
surgeon 20.1
portrait 18.8
senior 18.7
education 17.3
human 16.5
work 16.5
doctor 16
health 16
student 15.4
looking 15.2
science 15.1
happy 15
businessman 15
business 14.6
nurse 14.6
medicine 14.1
men 13.7
hand 13.7
team 13.4
elderly 13.4
worker 13.4
technology 13.4
laboratory 12.5
patient 12.3
face 12.1
equipment 12
hospital 11.8
old 11.8
holding 11.5
smile 11.4
mature 11.1
teacher 11
scientist 10.8
care 10.7
blackboard 10.7
lab 10.7
chemistry 10.6
working 10.6
retirement 10.6
modern 10.5
group 10.5
mask 10.3
glasses 10.2
smiling 10.1
suit 10.1
clinic 9.7
classroom 9.7
retired 9.7
scientific 9.7
test 9.6
research 9.5
casual 9.3
teamwork 9.3
room 9.2
pensioner 9.1
black 9
chemical 8.8
hair 8.7
women 8.7
sitting 8.6
clothing 8.5
executive 8.5
entrepreneur 8.2
lady 8.1
case 8.1
spectator 8.1
computer 8
handsome 8
world 8
lifestyle 7.9
surgery 7.8
emergency 7.7
class 7.7
profession 7.7
illness 7.6
biology 7.6
head 7.6
instrument 7.5
manager 7.4
laptop 7.4
occupation 7.3
grandma 7.2
office 7.2
school 7.2
together 7
look 7

Google
created on 2022-02-12

Microsoft
created on 2022-02-12

person 99.2
clothing 97.2
man 96.9
text 96.2
human face 72.9
black and white 62.8

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 71%
Calm 91.1%
Sad 6.7%
Happy 1%
Fear 0.6%
Angry 0.2%
Surprised 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 36-44
Gender Male, 95.2%
Calm 100%
Happy 0%
Sad 0%
Surprised 0%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%
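
The age, gender, and emotion estimates above are consistent with Amazon Rekognition's face-detection service, reported once per detected face. A minimal sketch, again assuming boto3, configured credentials, and a placeholder image file:

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as image_file:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

# One block of attributes per detected face, mirroring the two result blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')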

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a man standing in front of a window 76.7%
a man standing next to a window 75.7%
a man and a woman standing in front of a window 55.3%

Text analysis

Amazon

TE
MJ17--YT3R- -A
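
The fragments above read as raw text detections (OCR) rather than curated transcriptions. A minimal sketch of text detection with Amazon Rekognition, under the same assumptions as the earlier sketches:

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as image_file:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# Print each detected line of text, similar to the fragments listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])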

Google

32
32