Human Generated Data

Title

Untitled (dancers arranged in three lines, dancing outside, seen from above)

Date

1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14597

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Dance Pose 99.5
Leisure Activities 99.5
Person 97.8
Human 97.8
Person 95.4
Person 92.2
Person 83.9
Dance 75
Person 73.5
Person 71.9
Bird 64
Animal 64
Portrait 63.3
Photography 63.3
Face 63.3
Photo 63.3
Crowd 62.8
People 57.1
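Each machine-generated tag above pairs a label with a confidence score. As a minimal sketch in plain Python (the helper and threshold are illustrative, not part of the catalog data), the scored tags can be filtered down to a high-confidence subset like so:

```python
# A sample of the Amazon-generated tags above, as (label, confidence) pairs.
amazon_tags = [
    ("Dance Pose", 99.5),
    ("Leisure Activities", 99.5),
    ("Person", 97.8),
    ("Dance", 75.0),
    ("Bird", 64.0),
    ("People", 57.1),
]

def high_confidence(tags, threshold=90.0):
    """Keep only tags at or above the confidence threshold,
    ordered from most to least confident."""
    return sorted(
        [(label, score) for label, score in tags if score >= threshold],
        key=lambda t: -t[1],
    )

print(high_confidence(amazon_tags))
# → [('Dance Pose', 99.5), ('Leisure Activities', 99.5), ('Person', 97.8)]
```

A threshold around 90 keeps the labels the model is most sure of ("Dance Pose", "Person") while dropping weaker guesses such as "Bird" at 64.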

Clarifai
created on 2023-10-27

people 99.3
monochrome 98.1
fish 97.4
group 97
water 94.8
child 92
sea 91.7
man 91.5
many 91.2
no person 90.7
art 90.4
woman 88.9
street 87.6
nature 85.3
military 84.9
ocean 84.1
underwater 82.9
group together 82.9
wear 80.9
several 80.3

Imagga
created on 2022-01-29

fastener 37.7
staple 36.5
paper fastener 29.3
restraint 26.2
device 19
jet 16.4
texture 16
grunge 14.5
old 13.9
fly 13
paper 12.5
flight 12.5
pattern 12.3
design 11.8
vintage 11.6
clothespin 11.6
silver 11.5
textured 11.4
black 10.8
airplane 10.6
travel 10.6
aircraft 10.5
color 10
water 10
paint 9.9
material 9.9
backdrop 9.9
steel 9.7
metal 9.6
grungy 9.5
art 9.2
industrial 9.1
dirty 9
sky 8.9
backgrounds 8.9
detail 8.8
sea 8.6
close 8.6
flying 8.5
aged 8.1
closeup 8.1
surface 7.9
holiday 7.9
boat 7.8
space 7.7
plane 7.7
cable 7.7
wallpaper 7.6
weathered 7.6
iron 7.5
metallic 7.4
template 7.3
lake 7.3
structure 7.1
vehicle 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.5
black and white 88.4
image 31
several 15.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Female, 72.3%
Happy 68.8%
Calm 18.8%
Sad 8%
Confused 2%
Angry 0.8%
Surprised 0.7%
Fear 0.5%
Disgusted 0.4%

AWS Rekognition

Age 37-45
Gender Female, 87.5%
Sad 53.1%
Calm 39.7%
Happy 3.2%
Confused 1.7%
Angry 1.1%
Disgusted 0.6%
Fear 0.4%
Surprised 0.2%

AWS Rekognition

Age 16-22
Gender Male, 98.7%
Calm 88.7%
Fear 3.6%
Sad 2.9%
Happy 2.7%
Angry 0.8%
Disgusted 0.5%
Confused 0.5%
Surprised 0.4%
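Each face record above lists an estimated age range, a gender estimate, and a distribution of emotion scores. A small illustrative sketch in plain Python (scores copied from the third face above) of reading off the dominant emotion:

```python
# Emotion scores for the third detected face (AWS Rekognition, above).
face_emotions = {
    "Calm": 88.7,
    "Fear": 3.6,
    "Sad": 2.9,
    "Happy": 2.7,
    "Angry": 0.8,
    "Disgusted": 0.5,
    "Confused": 0.5,
    "Surprised": 0.4,
}

# The dominant emotion is simply the highest-scoring entry.
dominant = max(face_emotions, key=face_emotions.get)
print(dominant, face_emotions[dominant])  # → Calm 88.7
```

Note that the scores form a probability-like distribution (they sum to roughly 100), so the top entry is the model's single best guess rather than a hard classification.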

Feature analysis

Amazon

Person
Bird
Person 97.8%
Person 95.4%
Person 92.2%
Person 83.9%
Person 73.5%
Person 71.9%
Bird 64%

Categories

Text analysis

Google

MJ3-- YT33A
MJ3--
YT33A