Human Generated Data

Title

Untitled (male figures in street)

Date

c. 1880-c. 1889

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.377.19

Machine Generated Data

Tags (label and confidence score)

Amazon
created on 2022-06-03

Person 99.8
Human 99.8
Person 99.7
Person 99.6
Wheel 98.9
Machine 98.9
Person 97.1
Art 92.7
Nature 77.9
Painting 77.3
Outdoors 65.3
Transportation 64.4
Vehicle 62.9
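
The Amazon entries above are object and scene labels with confidence scores of the kind Amazon Rekognition returns. A minimal sketch of how such labels might be retrieved with the boto3 client follows; the file name and region are placeholders for illustration, not details taken from this record.

# Sketch: fetch object/scene labels for an image with Amazon Rekognition.
# Assumes AWS credentials are configured; the file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("P1982.377.19.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=60,
)

# Each label carries a name and a 0-100 confidence, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')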

Clarifai
created on 2023-10-30

people 99.8
transportation system 99.3
two 99
cart 98.8
man 98.1
adult 98
art 97.4
print 96.9
seated 96.4
group 95.9
travel 95.7
woman 95.6
desert 94.4
vehicle 94.2
military 93.5
wear 93.2
child 92.9
cavalry 92.8
one 92.7
mammal 92.7
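
The Clarifai entries are concept predictions. The sketch below assumes Clarifai's v2 REST API and its general image-recognition model; the API key, model id, and image URL are placeholder assumptions, not values used for this record.

# Sketch: request concept predictions from Clarifai's v2 REST API.
# The API key, model id, and image URL are placeholders (assumptions).
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # Clarifai's general concept model
IMAGE_URL = "https://example.org/untitled-male-figures-in-street.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts come back with a 0-1 value; scaled to % they match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')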

Imagga
created on 2022-06-03

cart 46.4
horse cart 37.2
wagon 31.3
bicycle 29.5
bike 25.4
wheeled vehicle 24
old 20.9
transportation 20.6
carriage 20.2
travel 19.7
transport 19.2
horse 19
vehicle 18.8
wheel 17.9
street 17.5
cycle 15.6
beach 15.6
man 15.5
city 15
vacation 13.9
sand 13.4
sport 13.3
summer 12.9
road 12.6
ride 12.6
tourism 12.4
people 12.3
antique 12.2
seat 12.1
ancient 12.1
outdoors 11.9
jinrikisha 11.9
riding 11.7
male 11.3
sky 10.8
vintage 10.8
outdoor 10.7
tourist 10.5
sea 10.2
aged 10
urban 9.6
window 9.5
animal 9.4
water 9.3
person 9.2
historic 9.2
cleaning implement 9
recreation 9
farm 8.9
architecture 8.7
holiday 8.6
desert 8.6
building 8.3
active 8.1
sun 8
history 8
wheelchair 8
shed 7.9
cycling 7.9
day 7.8
sunny 7.8
tropical 7.7
graffito 7.5
leisure 7.5
landscape 7.4
decoration 7.4
retro 7.4
wall 7.3
landmark 7.2
rural 7
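
The Imagga entries are tags with confidence scores. The sketch below assumes Imagga's /v2/tags endpoint with HTTP Basic authentication; the key, secret, and image URL are placeholders.

# Sketch: request image tags from Imagga's /v2/tags endpoint.
# The API key, secret, and image URL are placeholders (assumptions).
import requests

IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"
IMAGE_URL = "https://example.org/untitled-male-figures-in-street.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Tags carry a 0-100 confidence and a per-language label, as listed above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')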

Google
created on 2022-06-03

Wheel 94.1
Window 85.8
Art 85.2
Painting 83.3
Working animal 82
Sky 80.8
Building 75.7
Cart 75.4
Pole 72.8
Illustration 72.5
Paint 72.1
Vintage clothing 71
Vehicle 70
Pack animal 69.8
Carriage 69.4
Visual arts 68.7
Watercolor paint 67.6
Drawing 67
History 66.3
Paper product 62.2
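
The Google entries are label annotations of the kind returned by Google Cloud Vision. A minimal sketch using the google-cloud-vision client library follows; the image URI is a placeholder, and scores are returned on a 0-1 scale.

# Sketch: fetch label annotations with the Google Cloud Vision client library.
# Assumes google-cloud-vision is installed and credentials are configured;
# the image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/untitled-male-figures-in-street.jpg"

response = client.label_detection(image=image)

# Scores are 0-1; scaled to % they correspond to the values listed above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")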

Microsoft
created on 2022-06-03

outdoor 96.5
text 89.1
drawing 88.6
painting 87.4
person 80
horse 60.1
horse-drawn vehicle 58.8
cart 58.3
carriage 47.9
drawn 40.2
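
The Microsoft entries are image tags of the kind returned by Azure's Computer Vision service. The sketch below assumes the v3.2 "tag" REST endpoint; the endpoint, subscription key, and image URL are placeholders.

# Sketch: request image tags from the Azure Computer Vision "tag" endpoint.
# The endpoint, subscription key, and image URL are placeholders (assumptions).
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"
IMAGE_URL = "https://example.org/untitled-male-figures-in-street.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Confidences are 0-1; scaled to % they correspond to the values above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')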

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 99.8%
Sad 99.8%
Calm 11.7%
Confused 8.3%
Fear 7.8%
Surprised 6.7%
Disgusted 2.1%
Angry 0.8%
Happy 0.4%

AWS Rekognition

Age 16-22
Gender Female, 54.8%
Sad 99.9%
Calm 19.6%
Surprised 6.4%
Fear 5.9%
Angry 2.7%
Happy 1.5%
Confused 0.6%
Disgusted 0.3%

AWS Rekognition

Age 19-27
Gender Male, 76.6%
Calm 92.4%
Surprised 6.4%
Fear 5.9%
Sad 4.7%
Angry 0.7%
Confused 0.3%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Male, 94.5%
Sad 91.3%
Calm 52.1%
Surprised 6.5%
Fear 6.3%
Confused 1.2%
Disgusted 0.8%
Angry 0.7%
Happy 0.5%
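
The four blocks above are per-face estimates (age range, gender, emotions) of the kind AWS Rekognition's DetectFaces operation returns; each emotion type carries its own confidence, which is why the percentages do not sum to 100. A minimal sketch follows; the file name and region are placeholders.

# Sketch: per-face age range, gender, and emotion estimates with Rekognition.
# Assumes AWS credentials are configured; the file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("P1982.377.19.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')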

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
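
The two blocks above are per-face likelihood attributes of the kind Google Cloud Vision's face detection returns. A minimal sketch follows; the image URI is a placeholder.

# Sketch: face likelihood attributes with the Google Cloud Vision client library.
# Assumes google-cloud-vision is installed; the image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/untitled-male-figures-in-street.jpg"

response = client.face_detection(image=image)

for face in response.face_annotations:
    for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        likelihood = getattr(face, f"{attr}_likelihood")
        # e.g. VERY_UNLIKELY -> "Very unlikely", matching the wording above
        print(attr.capitalize(), likelihood.name.replace("_", " ").capitalize())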

Feature analysis

Amazon

Person 99.8%
Wheel 98.9%
Painting 77.3%

Categories

Imagga

paintings art 95%
beaches seaside 3.9%