Human Generated Data

Title

Untitled (man and woman with dog holding rifles)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5147

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 98
People 85.2
Female 83.6
Face 80.4
Apparel 79.3
Clothing 79.3
Mammal 78.8
Animal 78.8
Canine 75.7
Girl 71.3
Photography 66.2
Photo 66.2
Portrait 65.9
Pet 65.8
Woman 64.3
Tree 61.8
Plant 61.8
Horse 59.5
Text 56.9
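
The Amazon tags above are label detections of the kind AWS Rekognition's DetectLabels API returns, each with a confidence score out of 100 (the bare numbers in these lists are percentages). A minimal boto3 sketch, assuming local image bytes and illustrative MaxLabels/MinConfidence values; the filename is hypothetical:

    import boto3

    # Assumes AWS credentials are configured in the environment.
    rekognition = boto3.client("rekognition")

    with open("steinmetz_5147.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,        # illustrative cap
        MinConfidence=50.0,  # illustrative threshold
    )

    # Each entry mirrors a row above, e.g. "Person 99.6".
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')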

Clarifai
created on 2023-10-26

people 99.4
dog 98
man 96
adult 95
group together 93.8
monochrome 93.1
canine 92.2
wear 92
group 90.9
woman 90.6
competition 90.1
outdoors 86.8
cavalry 86.3
grass 85.2
child 83.4
outfit 83.1
summer 82.1
mammal 82.1
winter 82
recreation 80.6
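
Clarifai's tags most likely come from its general image-recognition model, which scores concepts from 0 to 1 (rendered above as percentages). A hedged REST sketch; the model ID, the personal access token, and the base64 upload route are all assumptions:

    import base64
    import requests

    PAT = "YOUR_CLARIFAI_PAT"               # hypothetical credential
    MODEL_ID = "general-image-recognition"  # assumed public model ID

    with open("steinmetz_5147.jpg", "rb") as f:
        b64 = base64.b64encode(f.read()).decode()

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {PAT}"},
        json={"inputs": [{"data": {"image": {"base64": b64}}}]},
    )
    resp.raise_for_status()

    # Concepts carry 0-1 values; scale by 100 to match the list above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')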

Imagga
created on 2022-01-23

negative 51
film 41.2
picket fence 39.8
fence 33.1
snow 31.6
photographic paper 31.1
barrier 23.7
photographic equipment 20.7
winter 17.9
obstruction 15.9
people 15.6
old 15.3
statue 15.1
white 15.1
art 14.5
dress 13.5
sky 13.4
structure 13.3
weather 13
grunge 12.8
frost 12.5
tree 11.7
sculpture 11.6
cemetery 11.5
outdoor 11.5
outdoors 11.2
adult 11
bride 11
cold 10.3
silhouette 9.9
landscape 9.7
man 9.4
architecture 9.4
wedding 9.2
sport 9.1
active 9
color 8.9
light 8.8
decoration 8.8
snowy 8.7
sepia 8.7
forest 8.7
antique 8.6
happiness 8.6
portrait 8.4
fashion 8.3
vintage 8.3
human 8.2
park 8.2
fun 8.2
groom 8
trees 8
love 7.9
couple 7.8
face 7.8
black 7.8
male 7.8
season 7.8
men 7.7
summer 7.7
old fashioned 7.6
wood 7.5
monument 7.5
style 7.4
religion 7.2
activity 7.2
romantic 7.1
grass 7.1
day 7.1
textured 7
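
Note how Imagga reads the physical object rather than the scene: "negative", "film", "picket fence", and "photographic paper" describe the scanned negative itself. A sketch assuming Imagga's /v2/tags endpoint with basic auth; the credentials, image URL, and response shape are assumptions:

    import requests

    API_KEY = "YOUR_IMAGGA_KEY"        # hypothetical credential
    API_SECRET = "YOUR_IMAGGA_SECRET"  # hypothetical credential
    IMAGE_URL = "https://example.org/steinmetz_5147.jpg"  # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Assumed shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
    for t in resp.json()["result"]["tags"]:
        print(f'{t["tag"]["en"]} {t["confidence"]:.1f}')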

Microsoft
created on 2022-01-23

text 99.8
outdoor 97.3
dress 82.1
wedding dress 81.7
black 80.1
woman 72.7
bride 72.3
clothing 68.1
white 67.4
person 61.5
wedding 56.1
old 53.2
vintage 52.7
black and white 51.5
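
The Microsoft tags match the output of Azure Computer Vision's tag operation, which returns tag names with 0-1 confidences. A minimal REST sketch; the resource endpoint, key, and API version (v3.2) are assumptions:

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # hypothetical

    with open("steinmetz_5147.jpg", "rb") as f:
        image_bytes = f.read()

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    resp.raise_for_status()

    # "text 99.8" above corresponds to a tag with confidence 0.998.
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')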

Face analysis

AWS Rekognition (face 1)

Age 33-41
Gender Male, 85.3%
Calm 76.3%
Surprised 14.3%
Disgusted 2.8%
Confused 1.9%
Angry 1.8%
Sad 1.2%
Happy 1.2%
Fear 0.6%

AWS Rekognition (face 2)

Age 28-38
Gender Male, 87.4%
Confused 32.7%
Sad 23.5%
Calm 22.1%
Disgusted 6.6%
Surprised 4.7%
Angry 3.6%
Happy 3.6%
Fear 3.3%
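
Each Rekognition block above is a per-face result from the DetectFaces API: an estimated age range, a gender guess with its own confidence, and a full emotion distribution. A minimal boto3 sketch, reusing the hypothetical filename from earlier:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_5147.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

    for i, face in enumerate(response["FaceDetails"], 1):
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Face {i}: Age {age["Low"]}-{age["High"]}, '
              f'{gender["Value"]} {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'  {emotion["Type"].title()} {emotion["Confidence"]:.1f}%')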

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
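
Google Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why both faces read "Very unlikely" across the board. A sketch with the google-cloud-vision client library; attribute names follow its FaceAnnotation message:

    from google.cloud import vision

    # Assumes GOOGLE_APPLICATION_CREDENTIALS is set in the environment.
    client = vision.ImageAnnotatorClient()

    with open("steinmetz_5147.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
            likelihood = getattr(face, f"{attr}_likelihood")
            print(attr.title(), vision.Likelihood(likelihood).name)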

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

sa
13691.

Google

AGO
YT3RA2-NAMTZA3
13691.
sa
AGO YT3RA2-NAMTZA3 13691. sa 13691.
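
The detected strings are edge markings read off the film itself rather than scene text: "YT3RA2-NAMTZA3" is plausibly Kodak's "EASTMAN-SAFETY" edge printing appearing mirrored on the negative, and "13691." a frame or job number. The Amazon results are the kind produced by Rekognition's DetectText API; a minimal sketch:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_5147.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Results include both WORD detections and aggregated LINE detections,
    # which is why the Google list above repeats the same strings singly
    # and again as one combined line.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f'{detection["Confidence"]:.1f}')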