Human Generated Data

Title

Untitled (man teaching young boys to play baseball)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10473

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.7
Person 99.6
Person 98
Nature 97
Outdoors 96.8
Person 96.5
Clothing 94.4
Apparel 94.4
Countryside 86.8
Person 86.3
Building 85.5
Meal 85.2
Food 85.2
Rural 84.9
Shelter 84.9
Person 83.7
Person 82.9
Person 79.2
Person 78.2
Shorts 75
People 74.9
Hut 74.2
Shack 74
Face 73.5
Helmet 72.8
Table 70.8
Furniture 70.8
Cafeteria 70.2
Restaurant 70.2
Female 62.8
Person 62.2
Helmet 61.7
Kid 61.6
Child 61.6
Dugout 61.2
Dress 60.3
Crowd 60.2
Girl 57.9
Person 56.8
Ice 55.3
Room 55.3
Indoors 55.3
Person 46.5

Clarifai
created on 2023-10-25

people 99.8
group 98.4
many 98.3
group together 98.1
man 95.1
adult 94.5
baseball 93
woman 91.3
wear 90.4
interaction 89.4
monochrome 87.9
crowd 83.2
child 82.6
recreation 80.1
administration 79.4
uniform 79.1
outfit 78.2
commerce 76
actor 75.9
education 75.9

Imagga
created on 2022-01-09

man 25.6
person 19.8
male 18.5
people 14.5
spectator 14.1
statue 12.8
symbol 12.8
dark 12.5
adult 12.5
old 11.8
history 11.6
building 11.2
protection 10.9
destruction 10.7
outdoor 10.7
military 10.6
black 10.2
sculpture 10
danger 10
soldier 9.8
portrait 9.7
nuclear 9.7
sport 9.6
men 9.4
city 9.1
industrial 9.1
sky 8.9
art 8.9
weapon 8.9
war 8.7
mask 8.6
ancient 8.6
architecture 8.6
outdoors 8.3
dirty 8.1
disaster 7.8
army 7.8
travel 7.7
stage 7.7
explosion 7.7
sign 7.5
light 7.4
musical instrument 7.3
uniform 7.2
work 7.2
memorial 7.2
worker 7.1
clothing 7.1
brass 7

Google
created on 2022-01-09

Black 89.7
Black-and-white 84.2
Hat 79.4
Monochrome photography 74.2
Font 73.8
Crew 73.7
Monochrome 71.6
Vintage clothing 68.5
Event 65.5
Suit 64.9
History 63.8
Sitting 63.4
Team 63.2
Art 62.9
Room 61.6
Paper product 55.2
Window 55.1
Photographic paper 55
Rectangle 53.6
Photo caption 53.2

Microsoft
created on 2022-01-09

text 99.2
person 96.4
clothing 90.8
bottle 88.1
man 78.3
posing 40.4

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 75.4%
Calm 90.8%
Sad 8%
Happy 0.7%
Confused 0.3%
Angry 0.1%
Surprised 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 37-45
Gender Male, 97.1%
Calm 98.4%
Sad 0.8%
Confused 0.5%
Happy 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 57.4%
Happy 97.8%
Sad 1%
Calm 0.4%
Confused 0.4%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Female, 53.2%
Calm 95.2%
Happy 1.9%
Disgusted 1%
Sad 0.5%
Confused 0.5%
Angry 0.4%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 18-26
Gender Male, 99.1%
Calm 89%
Sad 9.9%
Confused 0.4%
Fear 0.2%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 18-24
Gender Male, 95.7%
Calm 99.3%
Happy 0.4%
Sad 0.2%
Angry 0%
Disgusted 0%
Surprised 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Female, 78.3%
Calm 99.6%
Fear 0.3%
Sad 0%
Surprised 0%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 25-35
Gender Female, 88.4%
Calm 98%
Happy 0.8%
Sad 0.6%
Surprised 0.3%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%
Confused 0%

AWS Rekognition

Age 26-36
Gender Male, 99.5%
Calm 80.8%
Sad 15.7%
Confused 1.5%
Happy 0.8%
Angry 0.5%
Surprised 0.3%
Fear 0.3%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Helmet 72.8%

Text analysis

Amazon

445
BOX
LKER
445 39.tirw.
39.tirw.