Human Generated Data

Title

Untitled (young girl with large bullhorn)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8581

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.7
Human 98.7
Clothing 97.3
Apparel 97.3
Person 96.9
Shoe 95.8
Footwear 95.8
People 92.6
Shoe 92
Shoe 86.3
Shoe 77.1
Photography 60.4
Photo 60.4
Military Uniform 59.3
Military 59.3
Suit 59.2
Coat 59.2
Overcoat 59.2
Horn 55.6
Musical Instrument 55.6
Brass Section 55.6
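
The label/confidence pairs above follow the format of an image-labeling service such as AWS Rekognition. As a minimal sketch (the exact pipeline is not documented here; the file name and region are hypothetical), comparable labels could be generated with boto3:

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("steinmetz_8581.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,
            MinConfidence=55,  # roughly the lowest confidence listed above
        )

    # Print "Label Confidence" pairs in the same style as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")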

Imagga
created on 2022-01-09

man 27.5
people 27.3
adult 25.3
device 25
male 24.8
weapon 21.9
person 21.8
business 21.2
musical instrument 18.9
megaphone 18.2
suit 17.1
sword 15.9
wind instrument 15.9
acoustic device 15.2
corporate 14.6
men 14.6
black 14.5
businessman 14.1
standing 13.9
professional 13.7
clothing 13.3
fashion 12.8
looking 12.8
brass 12.2
attractive 11.9
women 11.9
portrait 11.6
urban 11.4
human 11.2
building 11.1
holding 10.7
happy 10.6
pretty 10.5
couple 10.4
room 10.2
instrument 10.2
work 10.2
happiness 10.2
dress 9.9
life 9.8
clothes 9.4
occupation 9.2
lady 8.9
job 8.8
two 8.5
hand 8.3
gun 8.3
equipment 8.3
office 8.2
teacher 8.1
working 7.9
lifestyle 7.9
hair 7.9
love 7.9
smile 7.8
executive 7.8
model 7.8
modern 7.7
power 7.6
city 7.5
cheerful 7.3
protection 7.3
uniform 7.3
groom 7.1
day 7.1
indoors 7
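
The Imagga tags above carry the same tag/confidence structure. A sketch of how they could be reproduced against Imagga's v2 tagging endpoint (credentials and file name are hypothetical; endpoint details should be checked against Imagga's current documentation):

    import requests

    API_KEY = "your_imagga_api_key"        # hypothetical
    API_SECRET = "your_imagga_api_secret"  # hypothetical

    # Hypothetical local copy of the photograph.
    with open("steinmetz_8581.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    # Each entry pairs an English tag with a confidence score.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")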

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 98
black and white 94.3
text 89.7
standing 81.2
clothing 75.3
monochrome 69.1

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Calm 75.3%
Happy 20.5%
Disgusted 1.6%
Confused 1.4%
Sad 0.5%
Surprised 0.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 51-59
Gender Male, 96.9%
Calm 65.3%
Happy 12%
Confused 6.7%
Sad 6.5%
Disgusted 5%
Surprised 2.9%
Angry 0.9%
Fear 0.8%
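
The two AWS Rekognition face estimates above (age range, gender, and ranked emotions) match the shape of Rekognition's DetectFaces response when all attributes are requested. A minimal sketch, with a hypothetical file name:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Attributes=["ALL"] is required to get age, gender, and emotion estimates.
    with open("steinmetz_8581.jpg", "rb") as f:
        faces = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )["FaceDetails"]

    for face in faces:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are reported per face; sort to match the ranked lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")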

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
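
The Google Vision face results above report likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision client (file name is hypothetical; assumes application credentials are configured):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local copy of the photograph.
    with open("steinmetz_8581.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # Each detected face carries likelihood enums such as VERY_UNLIKELY,
    # which correspond to the "Very unlikely" entries above.
    for face in client.face_detection(image=image).face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)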

Feature analysis

Amazon

Person 98.7%
Shoe 95.8%

Captions

Microsoft

a man and a woman standing in front of a building 50.1%
a man and a woman standing next to a building 50%
a person standing in front of a building 49.9%
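
The three ranked captions above match the shape of Azure Computer Vision's image-description output. A sketch assuming the azure-cognitiveservices-vision-computervision SDK (endpoint, key, and file name are hypothetical; confidences are returned on a 0-1 scale):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # hypothetical
        CognitiveServicesCredentials("your_key"),                # hypothetical
    )

    # Hypothetical local copy of the photograph; ask for three caption candidates.
    with open("steinmetz_8581.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")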

Text analysis

Amazon

17795
17795.
use
.S6LLI

Google

5 יררו MAGOX-YT37A8-AMT8AS כוררן
5
יררו
MAGOX-YT37A8-AMT8AS
כוררן
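
The text-analysis strings above are raw OCR detections; the garbled Google output may reflect reversed or stylized lettering in the photograph itself. A sketch of how both sets of detections could be produced (file name is hypothetical; assumes AWS and Google credentials are configured):

    import boto3
    from google.cloud import vision

    with open("steinmetz_8581.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    # AWS Rekognition text detection: print each detected line.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])

    # Google Vision text detection: the first annotation is the full text block,
    # followed by individual words, matching the layout of the Google list above.
    client = vision.ImageAnnotatorClient()
    response = client.text_detection(image=vision.Image(content=image_bytes))
    for annotation in response.text_annotations:
        print(annotation.description)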