Human Generated Data

Title

Untitled (group of men and a woman with rifles on porch steps, Jos. Wharton Estate (Lippincott))

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5127

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.2
Person 99
Person 98.9
Person 98.9
Person 96.9
Person 96.7
Person 94.3
Clothing 93.1
Apparel 93.1
People 86
Person 72.5
Person 69.3
Outdoors 67.9
Person 67.7
Face 66.3
Female 66.2
Photography 62.6
Photo 62.6
Person 62.5
Shorts 58.1
Girl 57.2
Furniture 56
Suit 55.7
Coat 55.7
Overcoat 55.7
Building 55.3
Housing 55.2
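
The label/confidence pairs above match the output shape of AWS Rekognition's DetectLabels API. A minimal sketch of how such tags could be generated with boto3 follows; the image filename is hypothetical and AWS credentials are assumed to be configured.

```python
# Minimal sketch: label/confidence tags via AWS Rekognition DetectLabels.
# Assumes configured AWS credentials; the image path is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_5127.jpg", "rb") as f:  # hypothetical local scan
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the weakest label shown above scores 55.2
    )

# Each label pairs a name with a confidence score, e.g. "Person 99.6".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```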

Clarifai
created on 2023-10-26

people 99.9
group 97.5
child 96.2
adult 96.2
man 96.1
many 94.1
woman 90.6
education 89.9
uniform 86.8
group together 86.3
monochrome 84.9
administration 84.7
boy 84.7
wear 83.1
musician 79.1
music 78.3
actor 77
sit 76.9
leader 75.6
school 75.3
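
The Clarifai concepts follow the same name-plus-score pattern, with 0–1 scores rendered here as percentages. A hedged sketch against Clarifai's v2 REST predict endpoint; the API key, model ID, and image URL are placeholders.

```python
# Sketch: concept tags from Clarifai's v2 REST API.
# API key, model ID, and image URL are placeholders/assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # hypothetical
MODEL_ID = "general-image-recognition"  # assumed general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
response.raise_for_status()

# Concepts are name/value pairs; values are 0-1 scores,
# shown above as percentages (e.g. "people 99.9").
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```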

Imagga
created on 2022-01-23

negative 38.3
film 35.5
fountain 27.7
photographic paper 25.9
structure 25.4
architecture 21.1
photographic equipment 17.3
history 17
city 16.6
water 16
house 14.5
building 14.1
travel 14.1
wheeled vehicle 13.6
old 13.2
art 12.9
sky 12.1
light 12
construction 12
landscape 11.9
river 11.6
night 11.5
stone 11.4
ancient 11.2
freight car 11.1
grunge 10.2
fixture 10.2
street 10.1
silhouette 9.9
landmark 9.9
tower 9.8
scenic 9.6
bridge 9.4
drawing 9.4
outdoor 9.2
tourism 9.1
environment 9
retro 9
statue 8.9
symbol 8.7
urban 8.7
rock 8.7
scene 8.7
decoration 8.5
wagon 8.5
historical 8.5
car 8.4
plumbing fixture 8.3
vintage 8.3
ecology 8.2
scenery 8.1
sculpture 8.1
business 7.9
design 7.9
forest 7.8
black 7.8
cityscape 7.6
park 7.5
outdoors 7.5
famous 7.4
reflection 7.4
exterior 7.4
container 7.3
aged 7.2
window 7.2
religion 7.2
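
Imagga's list again pairs tags with confidence scores. A sketch against Imagga's v2 tagging endpoint; the credentials and image URL are placeholders.

```python
# Sketch: tags from Imagga's v2 /tags endpoint (HTTP basic auth).
# API key/secret and the image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),  # hypothetical
)
response.raise_for_status()

# Each entry pairs an English label with a confidence (e.g. "negative 38.3").
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```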

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.1
outdoor 89.6
person 86.2
clothing 83.6
wedding dress 64.5
old 62.1
wedding 56.6
posing 56
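
The Microsoft tags are consistent with the Azure Computer Vision tagging operation. A hedged REST sketch follows; the endpoint, key, and API version are assumptions.

```python
# Sketch: image tags from Azure Computer Vision's REST tag operation.
# Endpoint, subscription key, and API version are assumptions.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
KEY = "YOUR_AZURE_KEY"  # hypothetical

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
response.raise_for_status()

# Tags pair a name with a 0-1 confidence (e.g. "text 98.1" above).
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```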

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 82.7%
Calm 28%
Disgusted 21.4%
Sad 14.8%
Confused 12.8%
Angry 9.1%
Happy 8.8%
Surprised 3.1%
Fear 2.1%

AWS Rekognition

Age 43-51
Gender Female, 95.1%
Calm 55.2%
Happy 20.7%
Sad 7%
Confused 6.5%
Disgusted 3.8%
Fear 2.9%
Angry 2.5%
Surprised 1.4%

AWS Rekognition

Age 31-41
Gender Female, 98.4%
Happy 83.5%
Confused 6.4%
Calm 2.4%
Sad 1.9%
Disgusted 1.6%
Surprised 1.6%
Fear 1.4%
Angry 1.2%

AWS Rekognition

Age 36-44
Gender Male, 72.7%
Happy 39.8%
Sad 17.4%
Calm 14.3%
Fear 11.3%
Confused 6.5%
Surprised 5.1%
Disgusted 3.4%
Angry 2.1%

AWS Rekognition

Age 37-45
Gender Female, 95.2%
Happy 95%
Calm 4.1%
Fear 0.2%
Confused 0.2%
Sad 0.2%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 35-43
Gender Male, 50.9%
Calm 87%
Fear 7%
Confused 1.6%
Happy 1.6%
Sad 1.3%
Surprised 0.7%
Disgusted 0.5%
Angry 0.4%

AWS Rekognition

Age 41-49
Gender Female, 89.3%
Calm 99.3%
Sad 0.3%
Happy 0.2%
Fear 0.1%
Confused 0.1%
Surprised 0.1%
Angry 0.1%
Disgusted 0%

AWS Rekognition

Age 37-45
Gender Male, 95.3%
Calm 40.9%
Fear 33.9%
Happy 16.5%
Sad 4.4%
Confused 1.6%
Angry 1.1%
Surprised 1.1%
Disgusted 0.6%

AWS Rekognition

Age 31-41
Gender Male, 69.4%
Happy 52.4%
Calm 43.5%
Surprised 1.8%
Sad 1.2%
Confused 0.5%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 37-45
Gender Male, 97.6%
Calm 83.3%
Happy 5.4%
Confused 4.8%
Fear 2.1%
Sad 1.7%
Angry 1.1%
Surprised 0.9%
Disgusted 0.8%
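
Each AWS Rekognition block above (an age range, a gender call, and eight emotion scores) corresponds to one FaceDetail from the DetectFaces API. A minimal boto3 sketch; the image path is hypothetical.

```python
# Minimal sketch: per-face age range, gender, and emotions via
# AWS Rekognition DetectFaces. The image path is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_5127.jpg", "rb") as f:  # hypothetical
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are listed highest-confidence first, as above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```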

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
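
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A sketch with the google-cloud-vision client; credentials are assumed configured and the path is hypothetical.

```python
# Sketch: Google Cloud Vision face detection, which reports likelihood
# buckets (VERY_UNLIKELY .. VERY_LIKELY) rather than numeric scores.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_5127.jpg", "rb") as f:  # hypothetical
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum; print its symbolic name.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```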

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

15168
A70
15:168
MJI7 YT37AS A70
MJI7
YT37AS
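
These strings are raw OCR of the film's edge markings. AWS Rekognition's DetectText returns each hit both as a full LINE and as its constituent WORDs, which is why "MJI7 YT37AS A70" appears alongside its parts. A minimal sketch; the image path is hypothetical.

```python
# Sketch: OCR via AWS Rekognition DetectText. Detections come back as
# full LINEs and their constituent WORDs, as listed above.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_5127.jpg", "rb") as f:  # hypothetical
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```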

Google

15168- 15168 MJI7 YT3RA2 A7 81 の
15168-
15168
MJI7
YT3RA2
A7
81
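
Google's output has the same shape: the first annotation is the full detected string and the rest are individual tokens. A sketch with the google-cloud-vision client, under the same assumptions as above.

```python
# Sketch: Google Cloud Vision text detection. The first annotation is the
# full string; subsequent annotations are the individual tokens.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_5127.jpg", "rb") as f:  # hypothetical
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```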