Human Generated Data

Title

Untitled (three lines of ballet dancers practicing outside)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14623

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Dance Pose 98.4
Leisure Activities 98.4
Person 98.4
Human 98.4
Person 92.7
Person 92.7
Person 91.7
Outdoors 88.5
Nature 84.3
Person 84.3
Person 76.3
Person 67.7
Water 66.1
Dance 63.4
Portrait 61
Photography 61
Face 61
Photo 61
Shorts 55.2
Clothing 55.2
Apparel 55.2

Clarifai
created on 2023-10-27

monochrome 98.6
people 97.9
no person 97.6
water 96.8
nature 92.2
fish 91.2
group 90.4
many 88.3
sea 87.6
child 87.5
man 87.2
art 86.4
retro 85.9
woman 85.5
wood 85.1
street 85
leaf 84.2
underwater 83.4
black and white 81.9
tree 81.5

Imagga
created on 2022-01-29

fastener 26.1
staple 18.6
restraint 18.4
jet 17.7
device 16.7
water 15.3
sky 15.3
paper fastener 15
texture 14.6
grunge 14.5
flight 14.4
fly 14
textured 13.1
aircraft 12.4
design 12.4
pattern 12.3
black 12
travel 12
clothespin 11.7
airplane 11.6
boat 10.9
grungy 10.4
paint 10
backdrop 9.9
art 9.8
tree 9.8
old 9.7
paper 9.4
ship 9.2
material 9.2
transport 9.1
vacation 9
metal 8.8
silver 8.8
steel 8.8
surface 8.8
plane 8.7
high 8.7
vessel 8.6
cable 8.6
holiday 8.6
sea 8.6
flying 8.5
winter 8.5
lake 8.2
speed 8.2
outdoors 8.2
landscape 8.2
industrial 8.2
natural 8
close 8
day 7.8
aviation 7.8
weathered 7.6
power 7.5
drawing 7.5
snow 7.4
structure 7.4
air 7.4
sun 7.2
marina 7.2
color 7.2

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.2
black and white 95.5
water 89.9
outdoor 87.1
monochrome 75.6

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Female, 78.3%
Calm 60.9%
Happy 13.5%
Sad 9.6%
Confused 5.5%
Surprised 4%
Disgusted 2.5%
Fear 2.3%
Angry 1.7%

Feature analysis

Amazon

Person
Person 98.4%
Person 92.7%
Person 92.7%
Person 91.7%
Person 84.3%
Person 76.3%
Person 67.7%

Captions

Microsoft
created on 2022-01-29

an old photo of a bird 40.2%
old photo of a bird 36.8%
a photo of a bird 36.7%

Text analysis

Amazon

NAGOY
MJI3