Human Generated Data

Title

Untitled (boy at table playing with toy figures, looking down)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16957

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 99.5
Person 99.5
Chair 58.9
Furniture 58.9
Indoors 58.4

Imagga
created on 2022-02-26

blackboard 92.9
laptop 56.3
computer 47.5
person 31.6
man 31.6
happy 31.3
male 29.9
smiling 29.7
office 29
adult 28.6
classroom 28
people 27.9
business 27.3
working 26.5
education 25.1
businesswoman 23.6
technology 23
senior 22.5
teacher 22.1
work 22
room 20.8
home 20.7
smile 20
casual 19.5
desk 19
sitting 18.9
professional 18.4
student 18.3
school 17.9
mature 17.7
looking 17.6
indoors 17.6
notebook 17.1
lifestyle 16.6
women 16.6
couple 16.6
television 16.3
portrait 16.2
wireless 15.3
businessman 15
modern 14.7
happiness 14.1
child 13.8
corporate 13.7
men 13.7
success 13.7
class 13.5
family 13.3
table 13.2
together 13.1
telecommunication system 13
executive 12.9
face 12.8
teaching 12.7
elderly 12.4
learning 12.2
hair 11.9
old 11.8
worker 11.6
lady 11.4
attractive 11.2
communication 10.9
mother 10.9
board 10.9
holding 10.7
retirement 10.6
educator 10.3
handsome 9.8
pretty 9.8
job 9.7
studying 9.6
sofa 9.6
parent 9.4
blond 9.2
horizontal 9.2
suit 9
human 9
cheerful 8.9
kid 8.9
typing 8.8
boy 8.7
college 8.5
youth 8.5
finance 8.4
manager 8.4
teamwork 8.3
color 8.3
glasses 8.3
fun 8.2
indoor 8.2
active 8.1
chair 7.8
hands 7.8
students 7.8
older 7.8
concentration 7.7
daughter 7.6
two 7.6
reading 7.6
wife 7.6
book 7.5
meeting 7.5
one 7.5
study 7.5
confident 7.3
group 7.3
cute 7.2
team 7.2
father 7.2

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.8
table 94.1
person 86.3
handwriting 77.9
whiteboard 77.9
old 73.8
furniture 71.7
posing 39.4

Face analysis

AWS Rekognition

Age 6-14
Gender Male, 78.2%
Sad 99.2%
Calm 0.6%
Happy 0.1%
Confused 0%
Angry 0%
Fear 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 11-19
Gender Female, 67.5%
Calm 52.4%
Disgusted 24.3%
Angry 8%
Happy 6.3%
Sad 5.1%
Confused 1.6%
Fear 1.5%
Surprised 0.8%

AWS Rekognition

Age 24-34
Gender Female, 87%
Calm 85.8%
Sad 5.7%
Fear 2.5%
Confused 1.9%
Happy 1.7%
Surprised 1.1%
Angry 0.8%
Disgusted 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 58.9%

Captions

Microsoft

a vintage photo of a man 86.2%
a vintage photo of a man holding a sign 68.6%
an old photo of a man 68.5%

Text analysis

Amazon

23
KELLER
ALFREK
POLLOCK
ALIFRED
TESAS
TESAS --NACO
--NACO

Google

ALRED
KELLEN
POLLOLK
DVK
KELLEN ALRED POLLOLK DVK CVEE.
CVEE.