Human Generated Data

Title

Untitled (two women and a man with horseshoes and horse)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4544

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence scores, in percent)

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Horse 98.7
Animal 98.7
Mammal 98.7
Person 98.1
Horse 97.3
Person 96.7
Clothing 67.6
Apparel 67.6
Female 65.6
Photography 62.8
Photo 62.8
Colt Horse 58
Spoke 57.9
Machine 57.9
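
The labels above have the shape of output from the AWS Rekognition DetectLabels API. A minimal sketch of how similar tags could be generated for an image is below; the file name and thresholds are assumptions for illustration, not part of this record.

import boto3

# Minimal sketch: label detection with AWS Rekognition via boto3.
# "steinmetz_4544.jpg" is a placeholder file name, not taken from the record.
client = boto3.client("rekognition")

with open("steinmetz_4544.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of returned labels
    MinConfidence=50.0,  # drop low-confidence labels
)

# Print each label with its confidence, e.g. "Person 99.7"
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))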

Clarifai
created on 2023-10-15

people 99.6
cavalry 99.4
monochrome 96.6
man 95.3
adult 92.4
group together 91.9
group 88.1
three 86.8
woman 83.1
lid 82.2
seated 81.9
mammal 80.4
many 80.1
child 78.1
two 77
uniform 76.6
sport 76
transportation system 75.7
street 75.3
police 75.2

Imagga
created on 2021-12-14

person 21
people 18.9
man 17.5
adult 17.2
male 17
black 13.8
men 12.9
fashion 12.8
clothing 12.6
portrait 12.3
human 11.2
device 10.7
wind instrument 10.4
newspaper 10.1
lifestyle 10.1
model 10.1
dance 9.8
interior 9.7
style 9.6
musical instrument 9.6
professional 9.5
product 9.4
happiness 9.4
life 9.3
brass 9.3
music 9.1
girls 9.1
equipment 9
body 8.8
bride 8.6
sport 8.4
hand 8.3
health 8.3
suit 8.2
danger 8.2
dress 8.1
art 8
mask 8
hair 7.9
dark 7.5
active 7.5
action 7.4
room 7.4
wedding 7.4
business 7.3
patient 7.3
sensuality 7.3
exercise 7.3
sexy 7.2
home 7.2
women 7.1
face 7.1
work 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99
horse 97.6
outdoor 96.2
standing 80.2
animal 62.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 51-69
Gender Male, 90.8%
Calm 89.5%
Sad 4.7%
Surprised 3.1%
Confused 1.3%
Happy 0.5%
Angry 0.4%
Fear 0.4%
Disgusted 0.2%

AWS Rekognition

Age 39-57
Gender Female, 79.3%
Calm 77.2%
Sad 20%
Surprised 0.7%
Angry 0.6%
Happy 0.6%
Confused 0.5%
Fear 0.2%
Disgusted 0.1%
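
The age range, gender, and emotion estimates above match the structure of the Rekognition DetectFaces response. A minimal sketch follows, again with a placeholder file name rather than anything taken from the record.

import boto3

# Minimal sketch: face analysis with AWS Rekognition via boto3.
client = boto3.client("rekognition")

with open("steinmetz_4544.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 51, "High": 69}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 90.8}
    # Sort emotions by confidence, highest first, as in the listing above.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(age, gender["Value"],
          [(e["Type"], round(e["Confidence"], 1)) for e in emotions])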

Feature analysis

Amazon

Person 99.7%
Horse 98.7%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

19033
19033,
O
АМТААЗ
the АМТААЗ
the
CAREWELL
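
The strings above ("19033", "CAREWELL", etc.) are word- and line-level detections of the kind returned by the Rekognition DetectText API. A minimal sketch is below; the file name is assumed for illustration.

import boto3

# Minimal sketch: text-in-image detection with AWS Rekognition via boto3.
client = boto3.client("rekognition")

with open("steinmetz_4544.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# DetectText returns both LINE and WORD detections; print each with its confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))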

Google

19033. 19033 ,
19033.
19033
,