Human Generated Data

Title

What I Saw in Egypt

Date

c. 1880

People

Artist: Various Artists

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1985.30

Machine Generated Data

Tags

Amazon
created on 2019-11-11

Building 99.4
Architecture 99.4
Horse 98
Mammal 98
Animal 98
Horse 95.2
Person 93.9
Human 93.9
Pyramid 91.3
Person 90.5
Painting 79.2
Art 79.2
Person 78.6
Person 71.6
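
The label/confidence pairs above match the output format of Amazon Rekognition's DetectLabels operation. Below is a minimal sketch in Python with boto3, assuming configured AWS credentials; the image filename and confidence threshold are illustrative assumptions, not part of the record.

```python
# Minimal sketch: label detection with Amazon Rekognition (boto3).
# The file path, region, and MinConfidence value are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("what_i_saw_in_egypt.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=70,
)

for label in response["Labels"]:
    # Prints e.g. "Building 99.4", matching the tag/score format above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```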

Clarifai
created on 2019-11-11

pyramid 99.8
camel 99.8
people 99.8
print 99.6
illustration 98.7
group 98.7
Pharaoh 98.6
adult 98.6
art 98.3
man 98.1
mammal 97.5
lithograph 97.2
two 96.1
wear 96
seated 95.5
desert 95.2
woodcut 94.7
one 93.6
grave 93.6
painting 93.1
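
These concept/confidence pairs resemble responses from Clarifai's general image-recognition model. The sketch below targets Clarifai's v2 REST API via requests; the API key, model name, and image URL are placeholder assumptions, and the exact endpoint may differ by account setup.

```python
# Sketch only: Clarifai v2 predict call for the general model.
# API key, model name, and image URL are placeholder assumptions.
import requests

API_KEY = "your_clarifai_api_key"  # placeholder
IMAGE_URL = "https://example.org/what_i_saw_in_egypt.jpg"  # placeholder

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Prints e.g. "pyramid 99.8", matching the tag/score format above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```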

Imagga
created on 2019-11-11

ruler 83.5
pyramid 62
desert 47.4
sand 44.9
history 35.8
tourism 33.8
travel 33.1
stone 31.2
ancient 31.2
sky 30.6
architecture 29.1
monument 26.2
landmark 25.3
pharaoh 22.7
grave 22.3
tomb 21.6
great 20.1
famous 19.6
building 19
old 18.8
tourist 18.1
landscape 17.9
pyramids 16.8
archeology 16.7
rock 16.5
culture 16.3
museum 16.1
sphinx 14.8
vacation 14.7
summer 14.2
art 13.8
ruins 13.6
ruin 13.6
statue 13.4
tower 13.4
historic 12.8
earth 12.8
civilization 12.8
mountain 12.5
historical 12.2
place 12.1
city 11.6
heritage 11.6
temple 11.5
soil 11.4
east 11.2
dry 11.1
sculpture 10.8
dune 10.7
past 10.7
camel 10.6
outdoors 10.5
wall 10.4
stones 10.4
construction 10.3
clouds 10.1
shape 9.7
castle 9.6
mystery 9.6
depository 9.4
sunset 9
deserts 8.9
antique 8.7
middle 8.6
sunny 8.6
rocks 8.5
hill 8.4
outdoor 8.4
fortress 7.9
tour 7.7
destination 7.5
national 7.2
world 7.2
religion 7.2
facility 7.1
scenic 7
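
The tag list above is consistent with Imagga's /v2/tags endpoint, which returns tag/confidence pairs. A minimal sketch using requests with HTTP Basic auth follows; the API key, secret, and image URL are placeholder assumptions.

```python
# Sketch only: querying Imagga's /v2/tags endpoint.
# Credentials and image URL are placeholder assumptions.
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder
IMAGE_URL = "https://example.org/what_i_saw_in_egypt.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Prints e.g. "pyramid 62.0", matching the tag/score format above.
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```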

Google
created on 2019-11-11

Pyramid 92.9
Picture frame 76.7
Art 69.5
Photography 62.4
Illustration 54.5
Square 52.2
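
Labels like these can be produced with the Google Cloud Vision API's label detection feature. A minimal sketch assuming a recent version of the google-cloud-vision client library and application-default credentials; the image filename is an assumption.

```python
# Minimal sketch: label detection with the Google Cloud Vision API.
# Assumes google-cloud-vision >= 2.0 and configured credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("what_i_saw_in_egypt.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # label.score is 0-1; scale to match the percentages listed above.
    print(f"{label.description} {label.score * 100:.1f}")
```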

Microsoft
created on 2019-11-11

text 96.9
camel 94
gallery 90.7
horse 84.2
desert 83
arabian camel 66.3
animal 66
mammal 64.6
pyramid 57.7
old 51.1
room 45.6
envelope 33.6
picture frame 19.4
stone 6.5
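
The Microsoft tags above are consistent with the Azure Computer Vision image-tagging operation. A sketch against the REST /tag endpoint via requests; the resource endpoint, API version, key, and image filename are assumptions.

```python
# Sketch only: image tagging with the Azure Computer Vision REST API.
# Endpoint, key, API version, and file path are placeholder assumptions.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_azure_cv_key"  # placeholder

with open("what_i_saw_in_egypt.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Prints e.g. "camel 94.0", matching the tag/score format above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```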

Face analysis

Amazon

AWS Rekognition

Age 44-62
Gender Male, 50.4%
Happy 49.6%
Confused 49.5%
Surprised 50.1%
Fear 49.6%
Disgusted 49.5%
Angry 49.5%
Sad 49.5%
Calm 49.8%

AWS Rekognition

Age 36-52
Gender Female, 87.8%
Happy 32.7%
Angry 1.9%
Calm 53.4%
Confused 0.5%
Fear 0.4%
Disgusted 1.3%
Sad 8.4%
Surprised 1.4%

AWS Rekognition

Age 51-69
Gender Male, 50.4%
Fear 49.6%
Sad 49.6%
Disgusted 49.5%
Happy 49.5%
Confused 49.6%
Calm 50%
Angry 49.6%
Surprised 49.6%

AWS Rekognition

Age 50-68
Gender Male, 50.5%
Sad 49.7%
Disgusted 49.6%
Angry 49.8%
Confused 49.5%
Fear 49.8%
Happy 49.5%
Surprised 49.5%
Calm 49.6%
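
The age ranges, gender estimates, and per-emotion confidences above correspond to AWS Rekognition's DetectFaces operation with all attributes requested. A minimal sketch with boto3 follows; the image filename and region are assumptions.

```python
# Minimal sketch: face analysis with AWS Rekognition DetectFaces.
# File path and region are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("what_i_saw_in_egypt.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Prints e.g. "Happy 49.6%", matching the per-face blocks above.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```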

Feature analysis

Amazon

Horse 98%
Person 93.9%
Painting 79.2%

Text analysis

Amazon

Cug
GUCH Cug
GUCH
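
The strings above are raw detections in the style of AWS Rekognition's DetectText operation, which returns both line- and word-level results. A minimal sketch with boto3; the image filename and region are assumptions.

```python
# Minimal sketch: text detection with AWS Rekognition DetectText.
# File path and region are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("what_i_saw_in_egypt.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Each detection is a LINE or WORD with the raw detected string.
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))
```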