Human Generated Data

Title

Untitled (man and young boy on horses)

Date

1948

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2714

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Horse 99.1
Animal 99.1
Mammal 99.1
Horse 97.8
Human 93.9
Person 92.5
Equestrian 86.7
Person 85.5
Andalusian Horse 78.8
Apparel 65.9
Clothing 65.9
Horse 63.9
Stallion 59.2
Female 58.1
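
The Amazon entries above are label/confidence pairs of the kind returned by Amazon Rekognition's label detection. Below is a minimal sketch, assuming the boto3 SDK, default credentials, and a hypothetical local file name, of how such a list could be produced; none of these values come from this record.

import boto3

# Hypothetical client and image; region and file name are assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,
)

# Print "Label Confidence" pairs similar to the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')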

Imagga
created on 2022-01-16

sunset 26.1
person 25.4
silhouette 24
man 22.9
people 20.6
beach 19.8
adult 19
water 18
horse 17.6
sport 17.6
ocean 15.9
male 15.6
sky 14.7
outdoor 14.5
vaulting horse 14.3
dark 14.2
summer 13.5
outdoors 12.5
sea 12.5
athlete 11.9
portrait 11.6
dusk 11.4
fashion 11.3
sun 11.3
evening 11.2
sand 10.7
vacation 10.6
lady 10.6
boy 10.4
action 10.2
sports equipment 10.1
lifestyle 10.1
model 10.1
protection 10
gymnastic apparatus 9.9
attractive 9.8
sexy 9.6
player 9.3
danger 9.1
exercise 9.1
seat 9.1
dirty 9
fitness 9
dress 9
active 9
style 8.9
mask 8.9
destruction 8.8
body 8.8
protective 8.8
nuclear 8.7
shore 8.6
enjoy 8.5
light 8.3
ballplayer 8.3
leisure 8.3
street 8.3
alone 8.2
dancer 8.2
landscape 8.2
industrial 8.2
posing 8
world 8
radioactive 7.9
couple 7.8
radiation 7.8
performer 7.8
toxic 7.8
travel 7.7
chemical 7.7
run 7.7
gas 7.7
sign 7.5
fun 7.5
one 7.5
safety 7.4
pose 7.2
black 7.2
coast 7.2
recreation 7.2
shadow 7.2

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

text 97.2
outdoor 86.2
animal 75.7
black and white 56.5
horse 14

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 74.2%
Calm 44.8%
Happy 25.1%
Sad 18.8%
Disgusted 4.8%
Surprised 2.9%
Angry 1.5%
Fear 1.2%
Confused 0.9%

AWS Rekognition

Age 48-54
Gender Male, 99.3%
Sad 34%
Confused 24.5%
Calm 12.5%
Happy 12.3%
Surprised 4.8%
Disgusted 4.6%
Fear 4.2%
Angry 3%
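
The two AWS Rekognition blocks above (age range, gender, and emotion scores) match the shape of Rekognition's DetectFaces response. A minimal sketch, assuming boto3, default credentials, and a hypothetical file name:

import boto3

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort by confidence to mirror the listing.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')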

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
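
The Google Vision rows above are likelihood ratings from Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library and a hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)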

Feature analysis

Amazon

Horse 99.1%
Person 92.5%

Captions

Microsoft

a person riding a horse 88%
a person standing next to a horse 87.9%
a group of people standing next to a horse 87.8%
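
The Microsoft captions above resemble the ranked candidates returned by Azure Computer Vision's describe operation. A rough sketch, assuming the azure-cognitiveservices-vision-computervision package; the endpoint, key, and file name are placeholders, not values from this record.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("photo.jpg", "rb") as f:  # hypothetical file name
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption carries a confidence between 0 and 1.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")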

Text analysis

Amazon

as

Google

as
T3RA2-XAGO
as T3RA2-XAGO
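
The Text analysis entries are OCR hits from Amazon Rekognition text detection and Google Cloud Vision. A minimal sketch of the Rekognition side, assuming boto3 and a hypothetical local file:

import boto3

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection is a LINE or WORD with the detected string and a confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')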