Human Generated Data

Title

Untitled (boy holding baby)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17708

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.9
Human 98.9
Face 95
Clothing 79.7
Apparel 79.7
Photo 71.3
Photography 71.3
Portrait 70
Female 69.8
People 69.3
Child 66.4
Kid 66.4
Girl 65
Skin 61.7
Indoors 61.4
Floor 60
Baby 58.9
Head 58.3
Monitor 56.5
Electronics 56.5
Screen 56.5
Display 56.5
Art 55.1
Person 48.3

Imagga
created on 2022-02-26

grandfather 35
design 21.9
silhouette 21.5
globe 21.3
night 19.5
global 18.2
spectator 17.7
glowing 17.5
planet 17.3
light 16.7
moon 16.6
holiday 16.5
people 15.1
card 14.5
star 14.4
world 14.4
celebration 14.4
nation 14.2
grandma 14.2
graphic 13.9
shiny 13.4
black 13.4
decoration 13
wallpaper 13
bright 12.9
earth 12.8
icon 12.7
crowd 12.5
digital 12.2
symbol 12.1
man 11.9
winter 11.9
space 11.6
person 11.5
male 11.4
sphere 11
tree 10.8
art 10.6
color 10.6
frame 10.3
map 10.3
grunge 10.2
sky 10.2
greeting 10.2
lights 10.2
flag 10.1
happy 10
dark 10
cemetery 9.9
backdrop 9.9
spooky 9.8
new 9.7
horror 9.7
technology 9.6
love 9.5
event 9.2
snow 8.9
style 8.9
cheering 8.8
nighttime 8.8
audience 8.8
stadium 8.8
championship 8.8
vibrant 8.8
continent 8.7
couple 8.7
patriotic 8.6
banner 8.3
competition 8.2
park 8.2
year 8.2
dance 8.1
science 8
bat 7.9
scary 7.7
fear 7.7
international 7.6
vivid 7.4
artwork 7.3
connection 7.3
graphics 7.3
music 7.2
women 7.1
together 7
modern 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.4
black 82
toddler 74.3
white 69.9
human face 65.8
person 63.8
black and white 57.8
drawing 56.1

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 84.8%
Calm 99.9%
Sad 0.1%
Disgusted 0%
Confused 0%
Angry 0%
Surprised 0%
Fear 0%
Happy 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a man sitting in front of a window 56.6%
an old photo of a boy 56.5%
an old photo of a man 56.4%