Human Generated Data

Title

Untitled (men posed with racing dog)

Date

1972

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11570

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Clothing 99.3
Apparel 99.3
Person 98.9
Human 98.9
Person 98.8
Person 98.6
Person 98.6
Person 98.5
Overcoat 94.3
Coat 94.3
Suit 94.3
Shorts 91.6
Person 90
Chair 83.8
Furniture 83.8
Pedestrian 77.7
Female 76
Jacket 73.1
Blazer 73.1
People 72.1
Hand 65.5
Sailor Suit 61.3
Tuxedo 60.9
Girl 60.5
Vehicle 58.7
Transportation 58.7
Long Sleeve 56.6
Sleeve 56.6
Water 55.5
Waterfront 55.5
Face 55.3

Imagga
created on 2022-01-15

shopping cart 35.1
chair 32.8
handcart 29.2
man 28.9
pedestrian 28.8
people 27.9
rocking chair 26.4
wheeled vehicle 26
seat 23.6
person 18.8
male 18.4
container 16.8
business 15.2
city 14.1
lifestyle 13.7
silhouette 13.2
urban 13.1
adult 13.1
sky 12.8
outdoors 12.7
day 12.6
walk 12.4
sitting 12
men 12
beach 11.8
furniture 11.4
water 11.3
travel 11.3
sea 10.9
businessman 10.6
summer 10.3
relax 10.1
human 9.7
group 9.7
sun 9.7
office 9.6
sand 9.6
boy 9.6
walking 9.5
conveyance 9.3
shore 9.3
life 9.3
outdoor 9.2
ocean 9.1
job 8.8
working 8.8
happy 8.8
couple 8.7
women 8.7
standing 8.7
crowd 8.6
architecture 8.6
corporate 8.6
building 8.3
vacation 8.2
sunset 8.1
transportation 8.1
airport 7.8
glass 7.8
sunny 7.7
meeting 7.5
device 7.4
street 7.4
back 7.3
work 7.3
smiling 7.2
suit 7.2
smile 7.1
portrait 7.1
interior 7.1
happiness 7.1
together 7
professional 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 91.7
man 91.2
person 86.5
dog 84.9
clothing 61.4

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 100%
Sad 79.4%
Confused 17.9%
Calm 1.5%
Disgusted 0.4%
Happy 0.3%
Fear 0.2%
Surprised 0.2%
Angry 0.2%

AWS Rekognition

Age 24-34
Gender Male, 76.3%
Sad 76.7%
Confused 8.3%
Calm 7.9%
Angry 4.1%
Fear 1.2%
Disgusted 0.8%
Happy 0.6%
Surprised 0.4%

AWS Rekognition

Age 54-64
Gender Male, 99.6%
Calm 85.5%
Happy 10.6%
Confused 1.6%
Disgusted 0.7%
Angry 0.5%
Fear 0.4%
Sad 0.4%
Surprised 0.4%

AWS Rekognition

Age 48-54
Gender Male, 99%
Sad 80.7%
Happy 9.8%
Fear 3%
Surprised 1.8%
Calm 1.8%
Confused 1.1%
Angry 1%
Disgusted 0.8%

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Confused 66.8%
Calm 15.6%
Disgusted 6.2%
Sad 5.8%
Happy 2.7%
Angry 1%
Fear 0.9%
Surprised 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a man standing next to a dog 38.9%
a man standing on a court 38.8%
a group of people standing next to a man 38.7%

Text analysis

Amazon

59846.
STAKE
НА
SFC
LA
722 MEDDEY

Google

59846. Y2MEDLEY
Y2MEDLEY
59846.