Human Generated Data

Title

2 Horses

Date

?

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the Massachusetts Bay Transportation Authority, Boston Transit Collection, 5.2002.656

Machine Generated Data

Tags (label followed by confidence score, 0-100)

Amazon
created on 2022-06-04

Person 99.7
Human 99.7
Horse 98
Animal 98
Mammal 98
Vehicle 88.2
Transportation 88.2
Horse 87.9
Person 61.5
Spoke 60.2
Machine 60.2
Clothing 59.2
Apparel 59.2
Carriage 58.5
Horse Cart 57.4
Wagon 57.4
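
These labels have the shape of output from Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be reproduced with boto3 follows; the region, file name, and thresholds are illustrative assumptions, not values from this record:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # illustrative region

# Load the photograph as raw bytes (hypothetical local file name).
with open("2_horses.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels scores labels on a 0-100 confidence scale, matching the
# values above (Person 99.7, Horse 98, ...). Labels such as Person and
# Horse also carry an Instances list with per-object bounding boxes,
# which is what the feature-analysis percentages further down report.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,  # illustrative; the lowest tag above is Wagon at 57.4
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')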

Imagga
created on 2022-06-04

graffito 63.7
freight car 46
decoration 45.5
car 35.3
wheeled vehicle 28.2
art 21.4
old 19.5
sculpture 19
vehicle 18.5
religion 15.2
history 15.2
black 14.4
architecture 14.1
statue 12.5
god 12.4
ancient 12.1
culture 12
church 11.1
religious 10.3
man 10.1
detail 9.6
conveyance 9.6
antique 9.5
people 9.5
model 9.3
head 9.2
face 9.2
dark 9.2
city 9.1
water 8.7
holy 8.7
spiritual 8.6
person 8.5
historical 8.5
building 8.4
body 8
portrait 7.8
marble 7.7
wall 7.7
faith 7.7
design 7.6
one 7.5
historic 7.3
metal 7.2
landmark 7.2
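
Imagga's tags come from its REST tagging endpoint. A rough sketch using the requests library, assuming placeholder API credentials and a hosted copy of the image (neither comes from this record):

import requests

API_KEY = "your_api_key"        # placeholder Imagga credential
API_SECRET = "your_api_secret"  # placeholder Imagga credential
IMAGE_URL = "https://example.org/2_horses.jpg"  # hypothetical hosted image

# The /v2/tags endpoint returns tags with 0-100 confidence scores,
# matching the list above (graffito 63.7, freight car 46, ...).
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')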

Google
created on 2022-06-04

Microsoft
created on 2022-06-04

horse 99.2
outdoor 95.6
transport 87.4
animal 86.3
text 77.2
horse-drawn vehicle 69.5
old 49.7
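
Microsoft's tags match the output of the Azure Computer Vision image-tagging operation. A sketch of the v3.2 REST call, with a placeholder endpoint and key; note that Azure reports confidence on a 0-1 scale, so the values above appear scaled by 100:

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder
IMAGE_URL = "https://example.org/2_horses.jpg"  # hypothetical hosted image

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)

# Rescale 0-1 confidences to the 0-100 form used above (horse 99.2, ...).
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')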

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Calm 92.9%
Surprised 6.5%
Fear 6%
Sad 3.5%
Confused 1.6%
Angry 0.7%
Disgusted 0.5%
Happy 0.2%
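
These readings have the shape of Amazon Rekognition's DetectFaces output with all facial attributes requested. A minimal sketch, again with an illustrative region and file name:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # illustrative region

with open("2_horses.jpg", "rb") as f:  # hypothetical local file name
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates
# to each detected face, as reported above.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # e.g. CALM 92.9, SURPRISED 6.5, ...
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')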

Feature analysis

Amazon

Person 99.7%
Horse 98%

Categories

Imagga

paintings art 98.7%
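
This category matches Imagga's categorization endpoint; "paintings art" is one of the categories of its personal_photos categorizer. A sketch reusing the same placeholder credentials as above (the response field names are assumptions based on Imagga's documented v2 format):

import requests

API_KEY = "your_api_key"        # placeholder Imagga credential
API_SECRET = "your_api_secret"  # placeholder Imagga credential
IMAGE_URL = "https://example.org/2_horses.jpg"  # hypothetical hosted image

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

for category in response.json()["result"]["categories"]:
    print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')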