Human Generated Data

Title

Untitled (standing man with artificial leg)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8252

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-02-05

Architecture 99.5
Building 99.5
Person 98.1
Human 98.1
Pillar 95.1
Column 95.1
Stilts 93.7
Clothing 89.5
Apparel 89.5
Racket 85.2
Tennis Racket 85.2
Coat 68.7
Home Decor 57.8

Imagga
created on 2022-02-05

crutch 100
staff 100
stick 98.7
fashion 27.9
people 26.2
person 25
adult 24.6
dress 24.4
lady 20.3
attractive 19.6
model 19.4
portrait 19.4
man 15.4
sexy 15.2
hair 15
pretty 14.7
happy 13.8
cute 13.6
body 13.6
outdoors 13.4
park 13.2
outdoor 13
one 12.7
posing 12.4
standing 11.3
style 11.1
winter 11.1
elegance 10.9
light 10.7
male 10.6
human 10.5
walking 10.4
black 10.2
joy 10
sensual 10
active 9.9
sport 9.9
cold 9.5
love 9.5
expression 9.4
skin 9.3
face 9.2
make 9.1
old 9.1
summer 9
cheerful 8.9
couple 8.7
lifestyle 8.7
bride 8.6
elegant 8.6
tree 8.5
snow 8.5
clothing 8.4
sun 8
day 7.8
happiness 7.8
forest 7.8
life 7.8
wall 7.7
seductive 7.6
clothes 7.5
wedding 7.4
sensuality 7.3
gorgeous 7.2
stylish 7.2
looking 7.2
smile 7.1
spring 7.1
sky 7
season 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

tree 99.6
text 99.2
outdoor 97.9
person 81.4
clothing 73.5
man 64.1
footwear 56.8
picture frame 9.6

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 91.9%
Surprised 48.4%
Calm 44.8%
Happy 3.7%
Disgusted 1.4%
Sad 0.7%
Angry 0.4%
Confused 0.4%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%

Captions

Microsoft

a person standing next to a road 78.7%
a person walking down the street 75.8%
a person walking down a street 75.7%

Text analysis

Amazon

7666
MJI7
7666.
"
A70A
MJI7 ПТАЯТIИ A70A
ПТАЯТIИ

Google

7666
7666 766
766