Human Generated Data

Title

Untitled (girl dressed as rabbit leading a dog)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7639

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.4
Person 99.4
Animal 98.2
Mammal 98.2
Dog 98.2
Canine 98.2
Pet 98.2
Person 95.6
Person 93.5
Person 92.9
Person 92.8
Person 87.6
Person 82.4
People 78.5
Person 76.7
Clothing 76.2
Apparel 76.2
Person 75.2
Crowd 72.8
Pedestrian 71.1
Person 70.1
Leisure Activities 63.7
Indoors 61.2
Room 60.8
Bull 57
Person 56.3
Person 51.1
Person 49.9

Imagga
created on 2022-01-08

horse 68
horses 34.1
vaulting horse 27.8
equine 25.7
farm 25
ranch 23.7
rural 19.4
cow 19.1
stallion 18.6
sport 17.8
pasture 17.2
field 16.7
gymnastic apparatus 16.7
animals 16.7
snow 16.6
cattle 16.4
bull 16.1
riding 15.6
brown 15.4
winter 15.3
sunset 15.3
animal 14.8
equestrian 14.7
teacher 14.5
sky 14
saddle 14
black 13.8
sun 12.9
mane 12.7
outdoors 12.7
bovine 12.4
carriage 12.1
cold 12
educator 11.9
mare 11.8
people 11.7
sports equipment 11.6
grass 11.1
horseback 10.8
ride 10.8
fence 10.5
sidesaddle 10.4
landscape 10.4
sunrise 10.3
tree 10
silhouette 9.9
grazing 9.8
country 9.7
paddock 8.9
stable 8.9
rider 8.8
man 8.7
professional 8.4
outdoor 8.4
livestock 8.3
countryside 8.2
morning 8.1
thoroughbred 8
person 7.9
courage 7.9
equipment 7.9
pony 7.9
support 7.8
travel 7.7
fog 7.7
clouds 7.6
placental 7.3
competition 7.3
trainer 7.2
cowboy 7.2
meadow 7.2
seat 7.1
barn 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 94.6
dog 87.5
standing 87.2
animal 68.5
carnivore 58.6

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 98.6%
Angry 36.2%
Sad 25.4%
Happy 18.6%
Calm 15.5%
Confused 1.7%
Disgusted 1.2%
Fear 0.8%
Surprised 0.8%

AWS Rekognition

Age 14-22
Gender Female, 92.7%
Happy 88%
Sad 6%
Calm 2.2%
Confused 1.2%
Fear 0.9%
Surprised 0.6%
Disgusted 0.6%
Angry 0.4%

AWS Rekognition

Age 9-17
Gender Male, 99.8%
Calm 44.4%
Sad 44%
Surprised 3.4%
Confused 3%
Fear 2.9%
Angry 0.9%
Happy 0.8%
Disgusted 0.6%

AWS Rekognition

Age 26-36
Gender Male, 75.9%
Disgusted 26.6%
Calm 24.6%
Confused 19.9%
Sad 10.2%
Fear 6.8%
Happy 5.6%
Angry 4.2%
Surprised 2.2%

AWS Rekognition

Age 22-30
Gender Male, 93.8%
Calm 64.1%
Sad 17.8%
Happy 8.2%
Fear 3.5%
Angry 2.8%
Disgusted 1.3%
Confused 1.2%
Surprised 1.1%

AWS Rekognition

Age 20-28
Gender Female, 88%
Calm 27.9%
Sad 24.9%
Confused 22.5%
Angry 8.5%
Surprised 5.6%
Fear 5.2%
Disgusted 3.1%
Happy 2.2%

AWS Rekognition

Age 22-30
Gender Female, 66.6%
Sad 88.5%
Calm 6.4%
Fear 1.5%
Happy 1.1%
Disgusted 1.1%
Angry 0.7%
Confused 0.4%
Surprised 0.2%

AWS Rekognition

Age 22-30
Gender Male, 99.9%
Calm 64.4%
Disgusted 16.7%
Confused 11.7%
Happy 1.9%
Sad 1.8%
Angry 1.4%
Surprised 1.4%
Fear 0.8%

AWS Rekognition

Age 23-31
Gender Male, 91.7%
Calm 96.3%
Sad 1.4%
Happy 1.2%
Confused 0.5%
Surprised 0.2%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 37-45
Gender Male, 89.1%
Sad 99.6%
Confused 0.2%
Calm 0.1%
Fear 0%
Disgusted 0%
Angry 0%
Happy 0%
Surprised 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Dog 98.2%

Captions

Microsoft

a group of people standing in front of a building 88.2%
a group of people standing next to a cow 88.1%
a group of people standing next to a dog 67.5%

Text analysis

Amazon

28467.

Google

28467.
YT3FA2-YAGON
28467. YT3FA2-YAGON