Human Generated Data

Title

Two Women Seated on a Cart Pushed by a Servant

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.79

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.1%
Human 99.1%
Samurai 98.6%
Person 97.1%
Person 94.4%
Musician 72.1%
Musical Instrument 72.1%
Photography 60.5%
Photo 60.5%
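
The Amazon tags above have the shape of AWS Rekognition label-detection output (a label name plus a 0-100 confidence). A minimal sketch of how such labels are typically retrieved with boto3; the S3 bucket and object key are hypothetical placeholders:

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
# The S3 bucket and object key below are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1978.484.79.jpg"}},
    MaxLabels=20,
    MinConfidence=60.0,  # drop labels scored below 60%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")
```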

Clarifai
created on 2023-10-25

people 100%
adult 99.5%
two 99.1%
one 99%
man 98.4%
group 98.3%
military 98.3%
portrait 97.3%
art 96.9%
weapon 96.5%
leader 96.3%
wear 95.2%
war 94%
illustration 93%
veil 92.9%
woman 92.4%
soldier 92.4%
administration 90.1%
print 88.9%
cavalry 88.4%
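
The Clarifai concepts above match the output of Clarifai's v2 predict endpoint, which scores each concept from 0 to 1. A minimal sketch assuming a legacy app-scoped API key; the key, model ID, and image URL are placeholders, and newer personal access tokens also require user_id/app_id fields in the request:

```python
# Minimal sketch: concept prediction with Clarifai's v2 REST API.
# API key and image URL are hypothetical placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # Clarifai's general concept model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
    timeout=30,
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}%")  # value is 0-1
```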

Imagga
created on 2022-01-09

puck 49.6%
disk 38.9%
globe 38%
circle 37%
earth 35.7%
planet 35.1%
world 33.8%
map 29.9%
phonograph record 25.2%
global 23.7%
sphere 22.1%
space 21.7%
money 18.7%
continent 18.5%
geography 16.4%
currency 16.2%
finance 15.2%
black 15%
round 14.7%
universe 14.6%
coin 14.3%
glass 14.1%
moon 14%
financial 13.4%
economy 13%
cash 12.8%
astronomy 12.7%
light 12.7%
clock 12.7%
ocean 12.5%
north 12.4%
business 12.2%
land 12%
container 11.8%
china 11.7%
symbol 11.5%
icon 11.1%
sea 11%
graphic 10.9%
stars 10.4%
south 10.3%
light bulb 10.1%
3d 10.1%
wall clock 9.9%
ceramic ware 9.8%
cup 9.8%
sun 9.7%
coins 9.7%
east 9.3%
science 8.9%
night 8.9%
button 8.8%
country 8.8%
dollar 8.4%
sky 8.3%
digital 8.1%
electric lamp 8.1%
surface 7.9%
shiny 7.9%
atlas 7.8%
color 7.8%
exchange 7.6%
clouds 7.6%
design 7.5%
one 7.5%
porcelain 7.4%
close 7.4%
metal 7.2%
timepiece 7.2%
market 7.1%
travel 7%
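
The Imagga tags above follow the format of Imagga's v2 auto-tagging endpoint, which authenticates with HTTP Basic auth using an API key/secret pair and returns per-tag confidences on a 0-100 scale. A minimal sketch; credentials and image URL are hypothetical placeholders:

```python
# Minimal sketch: auto-tagging with the Imagga v2 REST API.
# Credentials and image URL are hypothetical placeholders.
import requests

IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}%")
```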

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

book 99.7%
text 99.7%
person 95.2%
black 91.5%
man 87.2%
white 75.2%
drawing 74.1%
cartoon 70.6%
sketch 64.5%
old 64%
clothing 60.2%
vintage 59.9%
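
The Microsoft tags above resemble the output of Azure Computer Vision's image-tagging operation, which reports each tag's confidence from 0 to 1. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; endpoint, key, and image URL are hypothetical placeholders:

```python
# Minimal sketch: image tagging with the Azure Computer Vision SDK.
# Endpoint, key, and image URL are hypothetical placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://example.cognitiveservices.azure.com/"
KEY = "YOUR_COMPUTER_VISION_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

result = client.tag_image("https://example.org/image.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}%")  # confidence is 0-1
```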

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 99.7%
Calm 92.1%
Confused 4.3%
Sad 1.4%
Angry 1.2%
Surprised 0.3%
Disgusted 0.2%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Female, 100%
Calm 80.7%
Fear 6.6%
Sad 5%
Angry 2.6%
Confused 1.9%
Surprised 1.6%
Disgusted 1%
Happy 0.8%

AWS Rekognition

Age 41-49
Gender Male, 99.7%
Confused 35.9%
Sad 30.5%
Angry 14%
Calm 9.8%
Disgusted 4.1%
Fear 2.7%
Surprised 2.3%
Happy 0.7%
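
The three blocks above (an age range, gender with confidence, and emotions ranked by confidence, one block per detected face) match the face-attribute output of Rekognition's DetectFaces operation. A minimal boto3 sketch; the S3 bucket and object key are hypothetical placeholders:

```python
# Minimal sketch: face attribute analysis with AWS Rekognition via boto3.
# The S3 bucket and object key below are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1978.484.79.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```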

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 17
Gender Female
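
The single age and gender estimates above correspond to attributes that the legacy Azure Face API detect endpoint used to return; Microsoft has since restricted these attributes, so this is a hedged sketch of the legacy call only. Endpoint, key, and image URL are hypothetical placeholders:

```python
# Hedged sketch: the legacy Azure Face API detect call that returned
# age/gender estimates. Endpoint, key, and URL are hypothetical placeholders.
import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"
KEY = "YOUR_FACE_API_KEY"

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
    timeout=30,
)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")
```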

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
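
Unlike the services above, Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch with the google-cloud-vision client library; the image URI is a hypothetical placeholder:

```python
# Minimal sketch: face detection with the Google Cloud Vision client library.
# The image URI is a hypothetical placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/image.jpg"

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods come back as enum buckets such as VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```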

Feature analysis

Amazon

Person 99.1%

Categories

Imagga

paintings art 100%