Human Generated Data

Title

Untitled (two women and a man with dog standing outside circus train)

Date

1948

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5359

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Dog 99
Mammal 99
Animal 99
Canine 99
Pet 99
Person 99
Human 99
Person 98.9
Person 98.6
Clothing 93.1
Apparel 93.1
Outdoors 71
Wolf 67.2
Nature 66.5
Female 62.9
Face 62.1
People 60.3
Coat 59.3
Play 58.9
Photography 58.8
Photo 58.8
Shelter 56.5
Countryside 56.5
Building 56.5
Rural 56.5
Girl 55.5

Clarifai
created on 2023-10-26

people 99.7
man 98.4
adult 98.1
dog 97.1
canine 97
group together 95.2
wear 94.6
woman 92.8
two 92.4
monochrome 90.7
group 90.6
street 86.2
interaction 81.1
actor 80.5
four 76.8
portrait 75.5
three 75.5
child 75.4
military 74.7
music 74.3

Imagga
created on 2022-01-23

people 26.2
person 24.4
man 21.5
adult 19.4
male 16.5
dress 16.3
portrait 15.5
happiness 14.9
sketch 14.4
love 14.2
couple 13.9
face 13.5
happy 13.2
snow 13.1
men 12.9
sexy 12.8
child 12.8
fashion 12.1
winter 11.9
drawing 11.6
black 11.4
human 11.2
women 11.1
family 10.7
pretty 10.5
hair 10.3
wall 10.3
dog 10.1
attractive 9.8
bride 9.8
clothing 9.5
cold 9.5
head 9.2
outdoor 9.2
life 9.1
outdoors 9
cheerful 8.9
groom 8.8
sepia 8.7
smiling 8.7
holiday 8.6
smile 8.6
marriage 8.5
casual 8.5
relaxation 8.4
old 8.4
wedding 8.3
sensuality 8.2
lady 8.1
cool 8
lifestyle 7.9
representation 7.8
outside 7.7
health 7.6
skin 7.6
mother 7.6
elegance 7.6
nurse 7.5
city 7.5
room 7.4
teenager 7.3
dirty 7.2
looking 7.2
body 7.2
active 7.2
home 7.2
cute 7.2
posing 7.1
interior 7.1
work 7.1
day 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

outdoor 98.3
text 97.2
dog 91.9
carnivore 85.4
clothing 67.1
animal 60.8
posing 50.2

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 85.7%
Happy 51.8%
Sad 16.5%
Confused 11.9%
Fear 6.8%
Calm 5.4%
Angry 3.2%
Surprised 2.9%
Disgusted 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Dog 99%
Person 99%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

24026
2VEE1X

Google

'24026
'24026