Human Generated Data

Title

Untitled (man with leg braces)

Date

1974

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19807

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 99.8
Chair 99.8
Person 97.7
Human 97.7
Apparel 97.6
Clothing 97.6
Pants 89.9
Building 85.7
Architecture 85.7
Pillar 74.3
Column 74.3
Stilts 67.7
Jeans 66.9
Denim 66.9
Lighting 66.3
Shorts 60.1
Playground 55.9
Play Area 55.9
Door 55.8
Flooring 55.2
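
The label/confidence pairs above are typical image-tagging output. As a minimal illustration (not an actual API call), here is a sketch of filtering such pairs by a confidence threshold; the `(label, score)` values are copied from the Amazon tag list above, and the threshold is an arbitrary choice:

```python
# Sketch: filtering machine-generated labels by confidence score.
# The pairs below are copied from the Amazon tags above; the 90.0
# threshold is an arbitrary illustrative cutoff, not part of the record.
labels = [
    ("Furniture", 99.8), ("Chair", 99.8), ("Person", 97.7),
    ("Pillar", 74.3), ("Stilts", 67.7), ("Flooring", 55.2),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only the labels whose confidence meets the threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_labels(labels))  # ['Furniture', 'Chair', 'Person']
```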

Imagga
created on 2022-03-05

parallel bars 70.9
gymnastic apparatus 59.5
sports equipment 45.4
equipment 36
person 22.9
exercise device 22.2
device 21.8
adult 20
exercise bike 20
body 17.6
exercise 17.2
crutch 16.3
fitness 16.3
sport 14.9
man 14.8
people 13.9
health 13.9
portrait 13.6
one 13.4
black 13.2
lifestyle 13
staff 12.6
modern 12.6
energy 12.6
sexy 12
water 12
training 12
gym 11.5
male 11.3
stick 10.9
bathroom 10.8
active 10.8
fashion 10.5
women 10.3
shower 10.2
attractive 9.8
human 9.7
indoors 9.7
metal 9.6
home 9.6
hair 9.5
athlete 9.5
healthy 9.4
clean 9.2
lady 8.9
style 8.9
cool 8.9
treadmill 8.7
workout 8.6
skin 8.5
pretty 8.4
recreation 8.1
wet 8
posing 8
interior 8
smile 7.8
bike 7.8
model 7.8
pole 7.6
studio 7.6
professional 7.6
house 7.5
dark 7.5
fit 7.4
reflection 7.3
indoor 7.3
face 7.1
life 7

Google
created on 2022-03-05

Shorts 90.8
Fixture 87.2
Style 83.8
Sleeve 80.2
Knee 75.7
Symmetry 74.2
Monochrome photography 74.1
Monochrome 73.7
Elbow 72.8
Balance 70.8
Human leg 70.1
Rib 69.2
Metal 69
Personal protective equipment 66.8
Physical fitness 66.4
Font 64.7
Machine 64.2
Chest 63.5
Stock photography 61.8
Art 61.6

Microsoft
created on 2022-03-05

text 97.9
black and white 66.8
bicycle 66.7

Face analysis

Amazon

AWS Rekognition

Age 40-48
Gender Female, 68.8%
Calm 93.4%
Sad 4.5%
Angry 0.8%
Surprised 0.4%
Happy 0.2%
Disgusted 0.2%
Confused 0.2%
Fear 0.2%
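
The emotion scores above can be reduced to a single dominant emotion by taking the highest-confidence entry. A minimal sketch, using the values copied from the Rekognition face-analysis output above:

```python
# Sketch: picking the dominant emotion from Rekognition-style scores.
# The values are copied from the face-analysis output above.
emotions = {
    "Calm": 93.4, "Sad": 4.5, "Angry": 0.8, "Surprised": 0.4,
    "Happy": 0.2, "Disgusted": 0.2, "Confused": 0.2, "Fear": 0.2,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # Calm
```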

Feature analysis

Amazon

Person 97.7%

Text analysis

Amazon

74-733

Google

74-733
74-733