Human Generated Data

Title

Untitled (panorama of Carro)

Date

1890s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Dr. Robert Drapkin, 2.2002.2734

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Pedestrian 99.6
Human 99.6
Person 99.4
Person 99.1
Person 98.9
Person 98.8
Person 98.5
Person 98.1
Person 97.9
Person 97.7
Person 97.4
Road 97.3
Person 97.3
Building 97
Street 97
Urban 97
City 97
Town 97
Person 96.1
Person 95.8
Person 95.7
Person 93.9
Person 92.6
Asphalt 91.9
Tarmac 91.9
Person 89.8
Person 88.3
Person 88
Person 86.5
Person 85.5
Person 84.4
Path 82.1
Painting 80.1
Art 80.1
Clothing 78.6
Apparel 78.6
People 77.4
Person 74
Person 72.7
Zebra Crossing 71.3
Person 66
Person 65.7
Downtown 64.9
Person 63.5
Drawing 63.4
Person 63.4
Photography 61.3
Photo 61.3
House 60.9
Mansion 60.9
Housing 60.9
Person 60.7
Crowd 59.8
Architecture 59.6
Duel 58.5
Alley 57.2
Alleyway 57.2
Walking 55.9
Outdoors 55.2
Person 52
Person 50

Imagga
created on 2022-01-08

architecture 50.2
column 40.9
building 34.5
stucco 31.3
city 25.8
old 25.1
travel 24.6
stone 22.8
landmark 22.6
marble 22.4
history 22.4
tourism 20.6
hall 20.2
arch 19.9
ancient 19.9
interior 17.7
tourist 17.4
sculpture 16.9
monument 16.8
famous 16.7
art 15.3
wall 15
historic 14.7
house 14.2
historical 14.1
floor 13.9
palace 13.9
street 13.8
columns 13.7
culture 13.7
structure 13
museum 13
statue 12.6
facade 11.9
attraction 11.5
urban 11.4
church 11.1
antique 10.8
entrance 10.6
window 10.2
exterior 10.1
design 10.1
fountain 10
pillar 9.8
depository 9.8
architectural 9.6
sketch 9.4
place 9.3
town 9.3
roman 9.1
room 9.1
religion 9
facility 8.9
detail 8.8
court 8.8
light 8.7
scene 8.7
station 8.5
business 7.9
corridor 7.9
government 7.8
grand 7.8
luxury 7.7
door 7.7
capital 7.6
alley 7.5
temple 7.5
inside 7.4
new 7.3
people 7.3
transportation 7.2
indoors 7
modern 7
decoration 7

Google
created on 2022-01-08

Headgear 82.2
Art 82.1
Adaptation 79.3
Tints and shades 77.2
Vintage clothing 74.3
Event 69.7
Room 69.7
Visual arts 69.7
Building 69.2
Paper product 67.8
Suit 67.7
History 67.4
Pedestrian 67.1
Crowd 66.5
Illustration 65
Font 65
Street 63.6
Stock photography 63.4
Tree 62.6
City 60.6

Microsoft
created on 2022-01-08

horse 95.4
outdoor 93.7
person 85.7
snow 85.6
clothing 82.6
old 75.1
man 64.7
tree 61.6
text 60.4
street 56.3
building 51.2
drawn 47.5
several 11.5

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 75.4%
Calm 89.1%
Sad 6.1%
Confused 1.6%
Happy 1.3%
Fear 0.9%
Disgusted 0.4%
Surprised 0.3%
Angry 0.3%

AWS Rekognition

Age 19-27
Gender Female, 59%
Calm 97.5%
Happy 0.8%
Angry 0.8%
Sad 0.7%
Confused 0.1%
Surprised 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 10-18
Gender Female, 55.2%
Calm 96.7%
Sad 1%
Happy 0.9%
Angry 0.5%
Confused 0.4%
Surprised 0.2%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 21-29
Gender Male, 98%
Calm 97.8%
Sad 0.7%
Happy 0.5%
Angry 0.4%
Confused 0.3%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 16-22
Gender Male, 95.3%
Calm 70.3%
Sad 10.5%
Angry 6.3%
Confused 4.3%
Disgusted 3.4%
Surprised 1.9%
Happy 1.7%
Fear 1.6%

Feature analysis

Amazon

Person 99.4%
Painting 80.1%

Captions

Microsoft

a group of people riding on the back of a horse drawn carriage 60.5%
a group of people in an old photo of a horse 60.4%
a group of people riding on the back of a horse 60.3%

Text analysis

Amazon

ZARRAC