Human Generated Data

Title

Untitled (men and women posed on brick steps)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10550

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Shoe 99.5
Footwear 99.5
Clothing 99.5
Apparel 99.5
Person 99.5
Person 99.4
Person 99.1
Person 97.1
Tie 95.8
Accessories 95.8
Accessory 95.8
Person 95.8
Person 95.3
Shoe 94.8
Person 94.4
Person 92.9
Shoe 89.8
Dress 87.8
Musician 87.6
Musical Instrument 87.6
Leisure Activities 82.3
Costume 73
Female 71.1
Shorts 69.1
People 68.5
Guitarist 67.9
Guitar 67.9
Performer 67.9
Brick 66.7
Crowd 66.4
Music Band 65.1
Portrait 62.3
Face 62.3
Photography 62.3
Photo 62.3
Overcoat 59.8
Coat 59.8
Person 59.7
Shoe 59
Suit 59
Woman 58.4
Dance Pose 56.3

Clarifai
created on 2023-10-26

people 99.9
group 98.8
adult 97.1
group together 96.1
man 95.8
uniform 95.6
child 94.7
woman 94.5
many 94.2
administration 93.1
leader 92.9
boy 92.7
military 90.9
education 90.8
position 90
outfit 89.6
wear 88.1
school 84.5
several 83.9
home 79.5

Imagga
created on 2022-01-09

graffito 25
university 24.7
building 23.7
city 22.4
architecture 20.6
decoration 18.2
travel 17.6
history 16.1
house 16
old 14.6
facility 14
gymnasium 13.7
famous 12.1
landmark 11.7
structure 11.1
window 11.1
athletic facility 11
tourism 10.7
sculpture 10.6
statue 10.5
historical 10.3
wheeled vehicle 10.2
church 10.2
street 10.1
art 10
people 10
night 9.8
man 9.4
religion 9
urban 8.7
culture 8.5
palace 8.5
sport 8.4
town 8.3
historic 8.2
outdoors 8.2
square 8.1
ancient 7.8
stone 7.7
sky 7.6
bridge 7.6
monument 7.5
column 7.5
silhouette 7.4
vintage 7.4
light 7.3
holiday 7.2
antique 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

building 99.4
text 98.5
outdoor 96.8
musical instrument 92
person 89.4
black 73.9
clothing 73.5
man 72.9
posing 65.4
white 63.5
old 43.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Surprised 32.9%
Sad 21.2%
Fear 17.5%
Calm 10.1%
Confused 5.6%
Disgusted 5%
Angry 4.2%
Happy 3.6%

AWS Rekognition

Age 42-50
Gender Male, 93.3%
Calm 49.2%
Happy 46.2%
Sad 1.6%
Disgusted 1.1%
Confused 1%
Fear 0.4%
Surprised 0.3%
Angry 0.3%

AWS Rekognition

Age 33-41
Gender Male, 93.9%
Happy 67.2%
Calm 20.4%
Fear 5.4%
Disgusted 2.6%
Sad 1.7%
Angry 1.1%
Confused 0.9%
Surprised 0.8%

AWS Rekognition

Age 54-62
Gender Male, 99.9%
Happy 95%
Calm 3.6%
Surprised 0.8%
Sad 0.2%
Disgusted 0.2%
Confused 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 49-57
Gender Male, 71.2%
Calm 46.1%
Happy 44.3%
Surprised 3%
Disgusted 1.8%
Angry 1.5%
Fear 1.4%
Sad 1.2%
Confused 0.8%

AWS Rekognition

Age 30-40
Gender Male, 100%
Happy 47.5%
Sad 37.5%
Disgusted 5.2%
Confused 3.4%
Calm 1.9%
Angry 1.8%
Fear 1.5%
Surprised 1.2%

AWS Rekognition

Age 36-44
Gender Male, 99.1%
Calm 32.1%
Happy 28%
Sad 15.5%
Surprised 11.1%
Fear 4.3%
Disgusted 4.1%
Confused 3.3%
Angry 1.6%

AWS Rekognition

Age 30-40
Gender Male, 99.8%
Calm 70.6%
Sad 10.3%
Happy 8.4%
Confused 7.8%
Disgusted 1%
Surprised 0.9%
Fear 0.7%
Angry 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 99.5%
Tie 95.8%

Categories

Text analysis

Amazon

20613.
es

Google

20613. 20613. NAGON-YT3RA2-NAMTZA3 WAWN V
20613.
NAGON-YT3RA2-NAMTZA3
WAWN
V