Human Generated Data

Title

Untitled (couple with horse in field)

Date

c. 1950

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2854

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.6
Human 99.6
Person 98.3
Horse 97.3
Animal 97.3
Mammal 97.3
Dress 93.8
Clothing 93.8
Apparel 93.8
Female 83.5
Face 82.5
Colt Horse 77.6
Outdoors 76
Andalusian Horse 74.4
Girl 70.9
Furniture 70.9
Chair 70.9
Grass 69.4
Plant 69.4
Nature 65.9
Pants 65.5
Portrait 64.3
Photography 64.3
Photo 64.3
Woman 63.8
Kid 63.4
Child 63.4
Person 61.1
Coat 61
Building 59.4
Person 57.9
Spoke 57.7
Machine 57.7
Housing 57.1
Tower 55.8
Architecture 55.8
Steeple 55.8
Spire 55.8
Person 48.7
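Each machine-generated tag above is paired with a confidence score on a 0-100 scale. As a minimal illustrative sketch (not part of the museum's pipeline), the pairs can be filtered to a high-confidence subset in Python; the 90.0 threshold is an assumption chosen for illustration, not a documented cutoff:

```python
# Hypothetical filtering of (label, confidence) pairs like those listed above.
# The 90.0 threshold is an illustrative assumption, not a documented cutoff.
from typing import List, Tuple

def high_confidence(labels: List[Tuple[str, float]],
                    threshold: float = 90.0) -> List[str]:
    """Return label names whose confidence meets the threshold, best first."""
    kept = [(name, conf) for name, conf in labels if conf >= threshold]
    kept.sort(key=lambda pair: pair[1], reverse=True)
    return [name for name, _ in kept]

# A few values copied from the Amazon tag list above.
amazon_labels = [
    ("Person", 99.6), ("Horse", 97.3), ("Dress", 93.8),
    ("Female", 83.5), ("Outdoors", 76.0),
]
print(high_confidence(amazon_labels))  # → ['Person', 'Horse', 'Dress']
```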

Imagga
created on 2022-01-16

sport 29.5
person 24.3
man 22.8
people 21.2
adult 20.9
sky 18.5
outdoors 18.3
active 18
outdoor 16.8
sunset 16.2
silhouette 14.9
male 14.9
action 14.8
fitness 14.5
athlete 13.9
exercise 13.6
beach 13.4
lifestyle 12.3
grass 11.9
weapon 11.7
sand 11.7
activity 11.6
planner 11.3
fun 11.2
men 11.2
summer 10.9
leisure 10.8
horse 10.5
freedom 10.1
run 9.6
travel 9.2
competition 9.1
player 9.1
human 9
sports equipment 8.9
sun 8.9
wind instrument 8.8
life 8.8
day 8.6
sword 8.5
portrait 8.4
equipment 8.4
field 8.4
protection 8.2
danger 8.2
landscape 8.2
recreation 8.1
mountain 8
brass 7.9
play 7.8
device 7.7
athletic 7.7
two 7.6
clouds 7.6
adventure 7.6
walking 7.6
healthy 7.6
legs 7.5
performer 7.5
ocean 7.5
sports 7.4
clothing 7.4
vacation 7.4
speed 7.3
black 7.3
cowboy 7.3
body 7.2
musical instrument 7.1
game 7.1
mask 7.1
women 7.1
happiness 7.1

Google
created on 2022-01-16

Horse 97.9
Working animal 88.4
Mammal 85.5
Gesture 85.3
Bridle 85.2
Horse supplies 84.9
Horse tack 84.7
Black-and-white 83.2
Rein 76.9
Pack animal 76.7
Mane 75.4
Tree 74.8
Monochrome photography 74.4
Recreation 72.9
Landscape 72.4
Monochrome 72
Happy 71.7
Mare 71.1
Livestock 70.9
Stallion 65.3

Microsoft
created on 2022-01-16

outdoor 97.6
text 91.2
mammal 72.3
animal 68.3
person 59.1
posing 52.4
horse 15.2

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 88.4%
Calm 98.2%
Sad 0.9%
Happy 0.3%
Angry 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%
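The emotion percentages above sum to roughly 100, with one score dominating. A small sketch of how a dominant emotion might be selected from such scores (the dictionary values are copied from this listing; the selection logic is an assumption, not AWS Rekognition's own code):

```python
# Illustrative sketch: picking the dominant emotion from scores like those
# reported above. Values are copied from this listing; the max-score rule
# is an assumed heuristic, not Rekognition's documented behavior.
emotions = {
    "Calm": 98.2, "Sad": 0.9, "Happy": 0.3, "Angry": 0.2,
    "Surprised": 0.1, "Fear": 0.1, "Disgusted": 0.1, "Confused": 0.1,
}

dominant = max(emotions, key=emotions.get)
print(dominant)  # → Calm
```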

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Horse 97.3%

Captions

Microsoft

a person standing next to a horse 93.9%
a group of people standing next to a horse 92.5%
a man and a woman standing next to a horse 89%

Text analysis

Amazon

a
YТЗЗА-ОX

Google

YT33A2
YT33A2