Human Generated Data

Title

Untitled (children at Christmas party in front of tree)

Date

1955

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18219

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.5
Human 99.5
Clothing 99.5
Apparel 99.5
Person 99.4
Dress 99
Person 99
Blonde 98.4
Woman 98.4
Teen 98.4
Kid 98.4
Female 98.4
Child 98.4
Girl 98.4
Play 97.2
Person 96.9
Shoe 96.5
Footwear 96.5
Face 95.8
Person 91.9
Furniture 90
Chair 90
Pants 88.8
Smile 82.7
Person 81.2
Shoe 76.3
Suit 75.4
Coat 75.4
Overcoat 75.4
Photography 73.4
Photo 73.4
Portrait 73.4
Person 73.1
Shorts 71.6
People 68.2
Jeans 64.8
Denim 64.8
Outdoors 64.2
Plant 62.3
Tree 61.2
Man 61.1
Floor 58.6
Indoors 58.2
Door 57.4
Hand 56.4
Shoe 53.1
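
The list above pairs each detected label with a confidence score, and repeated labels such as "Person" correspond to separate instances found in the image. A minimal sketch of how such labels could be produced, assuming the Amazon tags come from Amazon Rekognition's DetectLabels operation; the S3 bucket and object key are hypothetical placeholders, not the museum's actual storage:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-photo.jpg"}},
    MaxLabels=50,
    MinConfidence=50,
)

# Print each label name with its confidence, mirroring the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")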

Imagga
created on 2022-03-04

person 28.7
man 26.9
people 20.1
black 20
patient 19.5
male 18.5
nurse 18.2
adult 15.9
lifestyle 13.7
indoors 13.2
case 12.8
portrait 12.3
sexy 12
sick person 12
men 12
style 11.9
fashion 11.3
room 10.9
hospital 10.8
face 9.9
attractive 9.8
device 9.7
equipment 9.6
professional 9.2
pretty 9.1
one 9
hairdresser 8.8
women 8.7
musician 8.6
performer 8.4
strength 8.4
elegance 8.4
hand 8.4
health 8.3
vintage 8.3
music 8.2
dress 8.1
clothing 8
looking 8
body 8
working 7.9
couple 7.8
hands 7.8
healthy lifestyle 7.8
concert 7.8
instrument 7.7
exercising 7.7
crutch 7.7
two 7.6
dark 7.5
clothes 7.5
smoke 7.4
training 7.4
retro 7.4
light 7.3
stage 7.3
business 7.3
exercise 7.3
gorgeous 7.2
team 7.2
work 7.1
handsome 7.1
medical 7.1
modern 7
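
Assuming the Imagga tags were generated with Imagga's v2 /tags endpoint, a minimal sketch looks like the following; the API key, secret, and image URL are placeholders:

import requests

IMAGGA_KEY = "your_api_key"        # placeholder credential
IMAGGA_SECRET = "your_api_secret"  # placeholder credential
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)

# Each tag carries a 0-100 confidence score, matching the list above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")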

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 98.2
clothing 96.1
person 95.8
black and white 83
man 75.1
drawing 57.4
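
The Microsoft tags read like output from the Azure Computer Vision service. A minimal sketch of retrieving tags with confidences, assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-resource.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),    # placeholder key
)

IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image URL

# Azure returns confidences in the 0-1 range; scale to percent to match the list above.
for tag in client.tag_image(IMAGE_URL).tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")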

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Female, 85.4%
Sad 50.9%
Calm 46.6%
Happy 0.9%
Fear 0.4%
Disgusted 0.4%
Surprised 0.3%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 22-30
Gender Female, 76.5%
Calm 73.9%
Sad 18.4%
Fear 3.7%
Angry 1.3%
Happy 1.1%
Disgusted 0.9%
Confused 0.4%
Surprised 0.3%
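
The two face blocks above report an estimated age range, a gender guess, and ranked emotion scores for each detected face. A minimal sketch, assuming they come from Amazon Rekognition's DetectFaces operation with all facial attributes requested; the bucket and object key are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-photo.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with per-emotion confidences; list the strongest first.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")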

Feature analysis

Amazon

Person 99.5%
Shoe 96.5%

Captions

Microsoft

a group of people in a room 88.4%
a group of people standing in a room 85.1%
a group of people jumping in the air 56.7%
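
Assuming the ranked caption candidates come from Azure Computer Vision's describe operation, a minimal sketch follows; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-resource.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),    # placeholder key
)

# Request up to three ranked caption candidates for a placeholder image URL.
description = client.describe_image("https://example.org/photo.jpg", max_candidates=3)

# Each candidate carries a 0-1 confidence; scale to percent to match the list above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")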

Text analysis

Amazon

DAVO
DAVO SPOCIETE
MJI7--YT3--X
SPOCIETE
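
The Amazon text lines appear to be raw OCR output and are kept exactly as the service returned them. A minimal sketch, assuming they come from Amazon Rekognition's DetectText operation; the bucket and object key are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-photo.jpg"}},
)

# Keep only LINE-level detections, matching the lines listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])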

Google

2--AGOX
MJI7--YT33A 2--AGOX
MJI7--YT33A
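
Assuming the Google text lines come from the Google Cloud Vision API's text detection, a minimal sketch using the google-cloud-vision client library; the image URI is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

image = vision.Image()
image.source.image_uri = "https://example.org/photo.jpg"  # placeholder image URI

response = client.text_detection(image=image)

# The first annotation is the full detected text block; later entries are individual tokens.
if response.text_annotations:
    print(response.text_annotations[0].description)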