Human Generated Data

Title

Untitled (two children with baby)

Date

c. 1950

People

Artist: Lainson Studios

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21868

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Furniture 100
Apparel 98.9
Clothing 98.9
Human 98.9
Person 98.9
Chair 98.8
Person 98.5
Person 97.5
Sitting 97.4
Chair 96.3
Shorts 91.7
Dress 80.7
Plant 79.7
Grass 78.7
Female 77.5
Face 72.6
Portrait 70.4
Photo 70.4
Photography 70.4
Table 70.4
Outdoors 66.9
Man 63.7
Kid 63.7
Child 63.7
Baby 62.8
Woman 60.9
Girl 60.2
Couch 59.7
Bench 57.2
Hat 56.9
Text 56.1

Imagga
created on 2022-03-11

chair 34.9
dancer 33
seat 26.8
person 26.4
rocking chair 25.1
performer 22.8
people 21.2
negative 20.7
sport 20.6
black 17.4
film 17.3
athlete 16.6
furniture 16
skill 15.4
muscular 15.3
body 15.2
training 14.8
art 14.7
anatomy 14.5
entertainer 14.4
adult 14.4
sculpture 14.4
style 14.1
fashion 13.6
player 13.1
event 12.9
flag 12.8
symbol 12.8
male 12.8
man 12.8
skeleton 12.7
stadium 12.6
fitness 12.6
photographic paper 12.6
fight 12.6
dance 12.5
science 12.4
silhouette 12.4
park 12.3
lights 12.1
nighttime 11.7
patriotic 11.5
biology 11.4
nation 11.4
against 11
field 10.9
pose 10.9
versus 10.8
lifestyle 10.8
cheering 10.8
shorts 10.8
hip 10.7
audience 10.7
championship 10.7
cool 10.6
match 10.6
jump 10.6
crowd 10.6
human 10.5
portrait 10.3
icon 10.3
competition 10.1
exercise 10
active 9.9
studio 9.9
history 9.8
modern 9.8
skull 9.8
gloves 9.7
dangerous 9.5
culture 9.4
action 9.3
old 9
lady 8.9
uppercut 8.9
jab 8.9
ropes 8.9
folding chair 8.8
punch 8.8
aerobics 8.8
boxing 8.8
bone 8.8
statue 8.7
dancing 8.7
block 8.6
model 8.5
ring 8.5
photographic equipment 8.4
health 8.3
furnishing 8
medical 7.9
x ray 7.8
jumping 7.7
motion 7.7
hand 7.6
arm 7.6
legs 7.5
fun 7.5
one 7.5
leg 7.4
design 7.3
teenager 7.3
sexy 7.2
activity 7.2
posing 7.1
face 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 96.6
furniture 95.9
chair 93
table 72.2
black and white 69.3
person 68.2
old 63.5

Face analysis

Amazon

AWS Rekognition

Age 51-59
Gender Female, 95.8%
Calm 99.4%
Happy 0.3%
Sad 0.1%
Surprised 0.1%
Disgusted 0.1%
Confused 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 7-17
Gender Male, 84.6%
Happy 55.4%
Fear 29.8%
Calm 5.4%
Sad 5.3%
Surprised 2.5%
Angry 1.3%
Disgusted 0.3%
Confused 0.1%

AWS Rekognition

Age 25-35
Gender Female, 59.2%
Calm 98.2%
Sad 1.4%
Angry 0.2%
Happy 0.1%
Disgusted 0%
Confused 0%
Surprised 0%
Fear 0%

Feature analysis

Amazon

Person 98.9%
Chair 98.8%

Captions

Microsoft

a vintage photo of a person sitting on a chair 74.9%
a vintage photo of a person sitting in a chair 74.8%
a vintage photo of a man and a woman sitting on a chair 50%

Text analysis

Amazon

85
KODYK-COVELA