Human Generated Data

Title

Untitled (boy playing with blocks)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16875

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.4
Human 98.4
Furniture 94.1
Chair 88.6
Clothing 83.4
Apparel 83.4
Sitting 82.6
Accessories 80.1
Accessory 80.1
Sunglasses 80.1
Face 76.7
Portrait 66
Photography 66
Photo 66
Senior Citizen 66
Screen 65.3
Electronics 65.3
Display 63
Monitor 63
Shelf 60.4
Standing 55.3

Imagga
created on 2022-02-26

person 42.1
player 40.2
crowd 39.3
cheering 39.2
audience 39
stadium 38.9
patriotic 38.3
nation 37.9
flag 37.6
lights 37.1
athlete 36.6
skill 36.6
event 36
muscular 35.3
nighttime 35.2
training 35.1
championship 35
match 34.7
sport 33.8
competition 32.9
silhouette 32.3
people 31.8
field 30.9
television 30.5
team 29.6
park 28.8
ball 27
symbol 26.2
automaton 26.2
icon 25.3
soccer 24.1
telecommunication system 24
kick 23.4
goal 23.1
pass 22.4
shoot 22.3
vibrant 21.9
football 21.1
bright 20.7
teamwork 19.5
glowing 19.4
man 18.8
versus 18.7
male 18.4
world 18.3
design 16.9
shiny 16.6
court 16.6
playing 16.4
shorts 13.7
global 13.7
job 13.3
adult 11.1
helmet 11
portrait 11
backhand 10.9
racket 10.8
serve 10.7
tennis 10.7
stars 10.4
black 10.2
occupation 10.1
sitting 9.4
businessman 8.8
work 8.7
group 8.1
lifestyle 7.9
chair 7.9
men 7.7
cross 7.5
holding 7.4
device 7.4
sexy 7.2
disk jockey 7.2
worker 7.2
face 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.9
person 98.6
wall 96.5
man 95.7
indoor 94.8
black and white 93.9
drawing 78.9
human face 65.6
clothing 65.3
old 54
working 53.2

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 99.7%
Calm 93.8%
Confused 1.9%
Disgusted 1.5%
Happy 1.1%
Angry 0.8%
Sad 0.6%
Surprised 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%
Sunglasses 80.1%

Captions

Microsoft

a man sitting at a table 82.4%
a man sitting on a table 73.9%
a man sitting on a counter 73.8%

Text analysis

Amazon

Z
m
9
S
I
l
was