Human Generated Data

Title

Untitled (two men standing next to a girl on a pony)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10490

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.7
Horse 99.1
Mammal 99.1
Animal 99.1
Person 98.9
Person 98.1
Clothing 93.1
Apparel 93.1
Hat 81
Hat 69.5
Female 60.6
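
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal boto3 sketch of how such tags could be generated follows; the bucket, key, and region names are placeholders, not part of this record.

    import boto3

    # Hypothetical image location; this record does not name the source file.
    BUCKET = "example-bucket"
    KEY = "steinmetz/4.2002.10490.jpg"

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
        MaxLabels=20,
        MinConfidence=60.0,
    )

    # Print label name and confidence, mirroring the "Person 99.7" style above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")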

Clarifai
created on 2023-10-25

people 99.8
cavalry 98.9
group 98.4
monochrome 98.2
man 97.8
group together 97.7
adult 93.3
many 92.8
woman 91.3
seated 90.3
transportation system 90.2
mammal 90.2
child 88.4
street 85.9
several 85.8
police 85.2
wear 83.1
crowd 80.4
administration 80.1
recreation 79.8
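
The Clarifai concepts above, each with a 0-100 confidence, resemble output from Clarifai's public general image-recognition model. A rough sketch using the clarifai-grpc Python client follows; the user, app, and model identifiers and the image URL are assumptions, and a personal access token (PAT) would be required.

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    PAT = "YOUR_CLARIFAI_PAT"                        # placeholder credential
    IMAGE_URL = "https://example.org/steinmetz.jpg"  # hypothetical image URL

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())

    request = service_pb2.PostModelOutputsRequest(
        # Public general model; these identifiers are assumptions about the setup.
        user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
        model_id="general-image-recognition",
        inputs=[resources_pb2.Input(
            data=resources_pb2.Data(image=resources_pb2.Image(url=IMAGE_URL))
        )],
    )
    response = stub.PostModelOutputs(request, metadata=(("authorization", "Key " + PAT),))

    # Concept values are 0-1; scale to percentages like "people 99.8".
    for concept in response.outputs[0].data.concepts:
        print(f"{concept.name} {concept.value * 100:.1f}")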

Imagga
created on 2022-01-09

people 25.1
man 24.8
adult 23.7
person 22.2
male 17
women 15.8
city 15.8
men 15.4
urban 14.8
clothing 14.5
black 14
horse 13.8
umbrella 13.6
human 13.5
business 13.4
portrait 11.6
two 11
world 10.9
life 10.8
mask 10.7
couple 10.4
activity 9.8
horses 9.7
active 9.7
crowd 9.6
standing 9.6
walking 9.5
happiness 9.4
lifestyle 9.4
model 9.3
face 9.2
street 9.2
outdoor 9.2
hand 9.1
animal 9.1
attractive 9.1
holding 9.1
nurse 9
group 8.9
suit 8.6
smile 8.5
travel 8.4
leisure 8.3
fashion 8.3
device 8.2
pose 8.1
sexy 8
businessman 7.9
work 7.8
hat 7.5
rain 7.5
photographer 7.5
occupation 7.3
protection 7.3
industrial 7.3
equipment 7.2
transportation 7.2
love 7.1
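
Imagga exposes its tagger as a REST endpoint, so a tag/confidence list like the one above could plausibly be reproduced with a single authenticated GET request. The sketch below assumes Imagga's v2 tags API; the credentials and image URL are placeholders.

    import requests

    IMAGGA_KEY = "your_api_key"                      # placeholder credentials
    IMAGGA_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/steinmetz.jpg"  # hypothetical image URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    resp.raise_for_status()

    # Each entry carries an English tag name and a 0-100 confidence score.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")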

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.5
horse 97.3
man 96.7
outdoor 95.6
animal 91
standing 88
text 86.2
clothing 58.4
black and white 53.5
posing 43.2
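
The Microsoft tags above look like Azure Computer Vision image-tagging output. A hedged sketch against the v3.2 Analyze REST endpoint follows; the resource endpoint, subscription key, and image URL are placeholders, and the API version actually used for this record is not stated.

    import requests

    ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"                                   # placeholder
    IMAGE_URL = "https://example.org/steinmetz.jpg"                 # hypothetical

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        headers={"Ocp-Apim-Subscription-Key": KEY},
        params={"visualFeatures": "Tags"},
        json={"url": IMAGE_URL},
        timeout=30,
    )
    resp.raise_for_status()

    # Confidences come back as 0-1 floats; scale to match "person 99.5" style.
    for tag in resp.json().get("tags", []):
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")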

Color Analysis

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 59.4%
Calm 51.3%
Sad 40%
Surprised 2.7%
Confused 2.2%
Happy 1.4%
Disgusted 0.9%
Fear 0.7%
Angry 0.7%

AWS Rekognition

Age 25-35
Gender Male, 71.4%
Calm 49.8%
Happy 43.9%
Surprised 2.1%
Sad 1.4%
Angry 1.2%
Disgusted 0.7%
Confused 0.5%
Fear 0.4%
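
The two age/gender/emotion blocks above match the per-face attributes returned by Amazon Rekognition's DetectFaces operation when all attributes are requested. A minimal boto3 sketch follows; the local file name is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz.jpg", "rb") as f:   # hypothetical local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come with confidences; sort to mirror the listings above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")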

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
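
The four blocks above correspond to Google Cloud Vision face detection, which reports joy, sorrow, anger, surprise, headwear, and blur as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the google-cloud-vision client follows; the image URI is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical image URI; the record does not name the underlying file.
    image = vision.Image(
        source=vision.ImageSource(image_uri="gs://example-bucket/steinmetz.jpg")
    )

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihood enums map to labels such as VERY_UNLIKELY, UNLIKELY, POSSIBLE.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)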

Feature analysis

Amazon

Person 99.7%
Horse 99.1%
Hat 81%

Categories

Text analysis

Amazon

SHOW
TRANSPORTATION
STERN TRANSPORTATION
STERN
STOCK SHOW
by
STOCK
of
eabled by
eabled
MJ17--YT37--
CCCO
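
The detected strings above, including partial reads such as "eabled by", are consistent with Amazon Rekognition's DetectText operation, which returns both full lines and individual words. A minimal boto3 sketch, with the local file name as a placeholder:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz.jpg", "rb") as f:   # hypothetical local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Each detection is either a LINE or a WORD; both appear in the listing above.
    for detection in response["TextDetections"]:
        print(detection["DetectedText"], detection["Type"],
              f"{detection['Confidence']:.1f}")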

Google

MJI7--YT 3RA°2--AGOX
MJI7--YT
3RA°2--AGOX