Human Generated Data

Title

Untitled (woman with tennis racket standing at net)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7547

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Clothing 99.2
Shoe 99.2
Footwear 99.2
Apparel 99.2
Shoe 97.1
Tennis Racket 93.5
Racket 93.5
Sport 85.7
Sports 85.7
Tennis Court 78.3
Shorts 75
Tennis Racket 70.8
Asphalt 56.5
Tarmac 56.5
Tennis 56.2

Imagga
created on 2022-01-08

man 32.2
barrier 31.6
obstruction 23.5
people 22.9
male 22.1
crutch 21.7
person 20.4
adult 20.1
business 19.4
staff 17.9
sport 16.9
stick 16.4
structure 16
businessman 15.9
outdoors 15.7
attractive 15.4
portrait 14.2
suit 13.8
city 12.5
urban 12.2
men 12
happy 11.9
professional 11.9
work 11.8
architecture 11.7
black 11.6
lifestyle 11.6
job 11.5
sexy 11.2
corporate 11.2
outside 11.1
day 11
outdoor 10.7
fashion 10.5
pretty 10.5
women 10.3
company 10.2
active 10
exercise 10
building 9.9
worker 9.8
ball 9.8
human 9.7
tennis 9.7
fun 9.7
success 9.7
wall 9.6
street 9.2
office 8.9
court 8.8
athlete 8.7
walk 8.6
competition 8.2
student 8.1
stylish 8.1
fitness 8.1
posing 8
cute 7.9
standing 7.8
modern 7.7
hand 7.6
career 7.6
one 7.5
silhouette 7.4
holding 7.4
style 7.4
action 7.4
sidewalk 7.4
alone 7.3
playing 7.3
businesswoman 7.3
smiling 7.2
recreation 7.2
handsome 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 99
text 98.7
street 90.5
black and white 87.7
tennis 86.1
athletic game 85.6
sport 83.7
footwear 83.6
white 67.5
monochrome 61.5

Face analysis

Amazon

Google

AWS Rekognition

Age 35-43
Gender Male, 77.1%
Calm 99.3%
Sad 0.3%
Happy 0.1%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 99.2%

Captions

Microsoft

a person with a racket 82.9%
a person is holding a racket 74.2%
a person holding a sign 74.1%

Text analysis

Amazon

a
15850.

Google

HAGON-YT3RA2-
MAMTZA
HAGON-YT3RA2- MAMTZA 3 15850.
15850.
3