Human Generated Data

Title

Untitled (group of young men sitting on steps)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19525

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Person 99.3
Person 98.8
Person 98.5
Person 98.1
Person 97.5
Person 96.9
Person 96.1
Pedestrian 94.6
Person 92.4
Person 91
Person 86.4
Person 84.4
Crowd 84.2
People 82.3
Person 81.5
Person 79
Person 78.6
Nature 76.2
Building 75.7
Outdoors 75.2
Urban 75.2
City 71.4
Town 71.4
Person 70.6
Architecture 67.9
Clothing 66.2
Apparel 66.2
Person 61.1
Text 60.6
Person 60.5
Photography 60.2
Photo 60.2
Downtown 60
Person 59.9
Shorts 58
Sport 57.3
Sports 57.3
Audience 55.5
Person 54.3
Person 50.3

Clarifai
created on 2023-10-22

people 100
many 99.9
group 99.3
group together 99.1
crowd 97.9
adult 97.7
woman 97.2
man 96.8
spectator 93.8
child 93.5
street 90.7
leader 90.2
wear 88.4
administration 88.3
music 86.1
recreation 85.7
several 84.9
vehicle 82.9
audience 79.6
military 77.7

Imagga
created on 2022-03-05

man 28.9
people 23.4
sport 20.1
male 19.2
city 19.1
person 15.8
silhouette 15.7
building 14.5
musical instrument 13.1
world 13
men 12.9
photographer 12.4
walking 12.3
group 12.1
sunset 11.7
wind instrument 11.6
couple 11.3
outdoors 10.8
crowd 10.6
beach 10.1
outdoor 9.9
business 9.7
spectator 9.5
active 9.4
adult 9.2
travel 9.2
danger 9.1
dirty 9
life 9
urban 8.7
boy 8.7
lifestyle 8.7
day 8.6
athlete 8.4
street 8.3
protection 8.2
runner 8.2
women 7.9
together 7.9
soldier 7.8
destruction 7.8
nuclear 7.8
military 7.7
gas 7.7
walk 7.6
pedestrian 7.6
brass 7.4
park 7.4
black 7.3
industrial 7.3
private 7

Google
created on 2022-03-05

(no tags returned)

Microsoft
created on 2022-03-05

text 95
person 94.1
footwear 92.1
clothing 91.5
outdoor 90.8
man 82.2
black and white 68.7
people 63.1
crowd 1.1

Face analysis

AWS Rekognition

Age 16-22
Gender Female, 76%
Calm 38.4%
Happy 28.1%
Sad 10%
Surprised 9.9%
Fear 6.5%
Angry 2.7%
Disgusted 2.3%
Confused 2.1%

AWS Rekognition

Age 16-22
Gender Male, 88.9%
Calm 63.7%
Fear 15.7%
Sad 15.3%
Happy 1.3%
Angry 1.1%
Disgusted 1%
Surprised 1%
Confused 1%

AWS Rekognition

Age 41-49
Gender Male, 96.6%
Sad 63.5%
Calm 19.8%
Fear 4.6%
Angry 3.4%
Disgusted 2.8%
Confused 2.5%
Happy 2.5%
Surprised 0.8%

AWS Rekognition

Age 37-45
Gender Male, 95.8%
Calm 82.3%
Sad 7.8%
Fear 2.4%
Disgusted 2.4%
Confused 1.7%
Happy 1.2%
Angry 1.1%
Surprised 1%

AWS Rekognition

Age 54-62
Gender Male, 59.7%
Sad 91.6%
Confused 3.3%
Calm 2.4%
Happy 1.3%
Angry 0.4%
Surprised 0.4%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 34-42
Gender Male, 95.7%
Calm 95.4%
Happy 4%
Disgusted 0.2%
Confused 0.1%
Sad 0.1%
Surprised 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 68.8%
Calm 90.4%
Sad 4.7%
Confused 3.1%
Disgusted 0.6%
Angry 0.4%
Fear 0.3%
Surprised 0.3%
Happy 0.2%

AWS Rekognition

Age 36-44
Gender Female, 76.3%
Calm 60.2%
Sad 22.4%
Confused 6.3%
Fear 4.1%
Happy 2.6%
Angry 2.1%
Disgusted 1.6%
Surprised 0.6%

AWS Rekognition

Age 21-29
Gender Female, 62%
Calm 88.9%
Happy 5.2%
Fear 2.1%
Sad 1.8%
Disgusted 0.6%
Confused 0.6%
Surprised 0.4%
Angry 0.4%

AWS Rekognition

Age 24-34
Gender Female, 91.3%
Calm 83.1%
Sad 5%
Disgusted 3.9%
Confused 2.4%
Fear 1.9%
Surprised 1.6%
Angry 1.2%
Happy 0.9%

AWS Rekognition

Age 11-19
Gender Female, 98.5%
Fear 43.8%
Calm 31.5%
Sad 13.2%
Confused 3.5%
Surprised 2.8%
Angry 2.7%
Disgusted 1.5%
Happy 0.9%

AWS Rekognition

Age 37-45
Gender Male, 97.8%
Calm 72%
Happy 23.9%
Confused 1.9%
Disgusted 0.7%
Sad 0.6%
Surprised 0.3%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 20-28
Gender Female, 68.5%
Calm 95%
Sad 2.2%
Confused 1%
Happy 0.5%
Surprised 0.5%
Angry 0.4%
Disgusted 0.3%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person
Person 99.5%
Person 99.3%
Person 98.8%
Person 98.5%
Person 98.1%
Person 97.5%
Person 96.9%
Person 96.1%
Person 92.4%
Person 91%
Person 86.4%
Person 84.4%
Person 81.5%
Person 79%
Person 78.6%
Person 70.6%
Person 61.1%
Person 60.5%
Person 59.9%
Person 54.3%
Person 50.3%

Text analysis

Amazon

YT33A2
٢٥٥٤
YT33A2 و محد
و محد