Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.63

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.5
Human 99.5
Person 98.6
Face 79.2
Wheel 72.6
Machine 72.6
Art 66
Drawing 65.5
Musician 59.6
Musical Instrument 59.6
Clothing 58
Apparel 58
Window 55.5

Clarifai
created on 2023-10-25

people 100
cavalry 99.3
adult 98.9
art 98.7
two 98.4
man 98.4
seated 98.2
mammal 97.4
portrait 97.3
one 96.8
cattle 96.3
group 94.3
child 91.9
vehicle 89.4
veil 89.4
woman 89
print 88.9
painting 88.1
illustration 87.8
transportation system 87.5

Imagga
created on 2022-01-09

radio telescope 61.8
astronomical telescope 49.4
globe 46.3
earth 44.8
planet 42.4
telescope 40.9
world 36.3
map 31.1
global 31
sphere 26.7
space 26.4
ingot 25.3
magnifier 24.6
continent 21.4
geography 20.2
cash 20.1
currency 19.7
block 19.6
money 19.6
finance 16.9
3d 16.3
universe 15.6
financial 15.1
business 14
dollar 13.9
round 13.8
ball 13.6
science 13.3
land 13.3
black 13.2
stars 13.2
atlas 13.1
economy 13
china 12.8
astronomy 12.7
scientific instrument 12.4
graphic 11.7
ocean 11.6
environment 11.5
north 11.5
wealth 10.8
digital 10.5
design 10.5
ceramic ware 10.2
sea 10.2
bank 9.9
surface 9.7
sky 9.6
coin 9.5
moon 9.5
cloud 9.5
light 9.4
glass 9.3
travel 9.2
continents 8.8
circle 8.8
symbol 8.8
paper 8.6
exchange 8.6
clouds 8.5
east 8.4
art 8.2
shape 8.1
reflection 7.9
cartography 7.9
worldwide 7.8
life 7.8
countries 7.8
equipment 7.8
dollars 7.7
coins 7.7
backboard 7.7
texture 7.6
international 7.6
bill 7.6
porcelain 7.5
south 7.5
close 7.4
banking 7.4
sun 7.3
color 7.2
transparent 7.2
satellite 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.4
person 97.9
clothing 89.1
man 80.2

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 99.8%
Calm 71.1%
Sad 13.9%
Surprised 6.4%
Confused 4.4%
Angry 1.6%
Disgusted 1.3%
Fear 1.1%
Happy 0.2%

AWS Rekognition

Age 33-41
Gender Male, 55.8%
Calm 78.3%
Surprised 15.8%
Sad 4.5%
Angry 0.5%
Disgusted 0.4%
Confused 0.2%
Fear 0.2%
Happy 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Wheel 72.6%

Categories

Imagga

paintings art 100%

Captions