Human Generated Data

Title

Untitled (two children under Christmas tree)

Date

c.1970, printed from 1955 negative

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18230

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.8
Human 99.8
Person 99.7
Wood 97.8
Furniture 91.5
Chair 91.5
Apparel 86.3
Footwear 86.3
Shoe 86.3
Clothing 86.3
Plywood 86
Flooring 82.5
Shoe 76.6
Building 61.4
Floor 60.4
Workshop 59.6
Electronics 56.6
Porch 55

Imagga
created on 2022-02-25

adult 33.4
sitting 30.9
people 29.6
happy 26.9
man 26.9
male 26.7
person 25.5
attractive 23.8
lifestyle 21
happiness 20.4
smiling 19.5
couple 19.2
couch 18.4
home 18.3
smile 17.1
casual 16.9
father 16.5
child 16.4
portrait 16.2
women 15.8
love 15.8
family 15.1
indoors 14.9
fun 14.2
together 14
mother 13.5
world 13.1
cheerful 13
indoor 12.8
dad 12.4
group 12.1
men 12
pretty 11.9
computer 11.3
looking 11.2
parent 11
room 10.9
leisure 10.8
cute 10.8
comfort 10.6
fashion 10.6
body 10.4
business 10.3
black 10.2
model 10.1
playing 10
girls 10
laptop 10
chair 9.9
clothing 9.9
passenger 9.8
handsome 9.8
professional 9.7
sexy 9.6
hair 9.5
adults 9.5
enjoyment 9.4
dark 9.2
businesswoman 9.1
blond 9
activity 9
sofa 8.9
interior 8.8
life 8.8
daughter 8.7
jeans 8.6
togetherness 8.5
studio 8.4
20s 8.2
lady 8.1
urban 7.9
brunette 7.8
husband 7.7
comfortable 7.6
two 7.6
elegance 7.6
holding 7.4
technology 7.4
street 7.4
relaxing 7.3
dress 7.2
suit 7.2
romantic 7.1
working 7.1
work 7.1
modern 7

Google
created on 2022-02-25

Photograph 94.1
Black 89.9
Black-and-white 84.7
Style 83.8
Wood 78.2
Toddler 76.9
Tints and shades 76.5
Monochrome photography 75.4
Monochrome 73.9
Flooring 70.4
Vintage clothing 66.7
Fun 66.3
Sitting 66.2
Stock photography 65.3
Child 64.6
Toy 62.1
Room 62.1
Photo caption 59.3
Play 55.7
Square 52.3

Microsoft
created on 2022-02-25

person 99.3
text 98.4
clothing 96.9
sitting 95.7
indoor 88.4
black and white 71.5
man 68.7
human face 65.4

Face analysis

AWS Rekognition

Age 2-8
Gender Male, 77.9%
Calm 79.5%
Sad 18.4%
Confused 0.9%
Angry 0.4%
Disgusted 0.3%
Surprised 0.2%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 4-10
Gender Male, 100%
Happy 33.3%
Calm 31.1%
Angry 16.1%
Sad 6.7%
Confused 5.1%
Disgusted 3.6%
Surprised 2.5%
Fear 1.6%

Microsoft Cognitive Services

Age 9
Gender Male

Microsoft Cognitive Services

Age 6
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Chair 91.5%
Shoe 86.3%

Captions

Microsoft

a person sitting on a table 78.5%
a person sitting on a bench 68.8%
a man and a woman sitting on a table 55.5%

Text analysis

Amazon

MIXED
FROM
GL
KEEP
MIXED CEREAL
THIS SIDE GL
AWAY
THIS SIDE
CEREAL
BEAT
I
-
Cradle
Cradle Babe
Babe
Ave Marki - -
PEELA-
Fostur
PAGS
100
PEELA- EII
9 TORIA I Lets
EII
Ave
Marki
PACKEZ
TORIA
TRABLUM
Lets
9

Google

CEREAL
MIXED CEREAL
MIXED