Human Generated Data

Title

Untitled (street scene, Africa)

Date

1910s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3185

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Wood 98.2
Person 98
Human 98
Person 97.7
Person 97.6
Person 96.4
Metropolis 94.8
Building 94.8
Urban 94.8
City 94.8
Town 94.8
Wheel 94.5
Machine 94.5
Wheel 94
Person 93.6
Person 88.8
Wheel 88.7
Vehicle 86.1
Transportation 86.1
Wheel 84.6
Wheel 83.4
Person 81.6
Bench 81
Furniture 81
Plywood 78.3
Spoke 77.7
Person 70.8
Tire 68.9
Wheel 65.4
Truck 64.3
Car 62
Automobile 62
Clothing 59.6
Apparel 59.6
Workshop 58.2
Alloy Wheel 56.8
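
The Amazon tags above pair a label name with a confidence score, in the shape returned by the Rekognition `DetectLabels` API. As a minimal sketch, the listing format can be reproduced from such a response like this (the sample response below is illustrative, not the actual API output for this image):

```python
# Format a Rekognition-style DetectLabels response into "Name Confidence"
# lines like the listing above. The sample response is illustrative only.
def format_labels(response):
    lines = []
    for label in response["Labels"]:
        conf = round(label["Confidence"], 1)
        # Drop a trailing ".0" so 98.0 prints as "98", matching the listing style
        conf = int(conf) if conf == int(conf) else conf
        lines.append(f"{label['Name']} {conf}")
    return lines

sample = {
    "Labels": [
        {"Name": "Wood", "Confidence": 98.24},
        {"Name": "Person", "Confidence": 97.98},
    ]
}
print(format_labels(sample))  # ['Wood 98.2', 'Person 98']
```

In practice the response would come from a `boto3` Rekognition client call such as `detect_labels(Image=..., MinConfidence=...)`; only the formatting step is shown here.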

Clarifai
created on 2023-10-25

people 99.9
transportation system 98.9
vehicle 98.5
group together 98.4
adult 96.1
street 95.8
many 94.3
group 92.9
wagon 92.5
train 89.4
man 87.8
cart 87.6
several 87.1
monochrome 86.9
truck 86.3
railway 85.8
war 81.8
cavalry 80.2
two 77.4
soldier 76.3

Imagga
created on 2022-01-08

barbershop 100
shop 100
mercantile establishment 86.7
place of business 57.8
establishment 28.9
old 25.8
chair 19.3
vehicle 17.6
cart 16.4
house 15.9
transportation 15.2
horse cart 15.2
furniture 13.5
wood 13.3
architecture 13.3
wagon 12.9
building 12.2
seat 12.1
travel 12
industry 11.9
transport 11.9
horse 11.4
park 10.7
carriage 10.6
antique 10.5
landscape 10.4
stone 10.1
industrial 10
wheeled vehicle 9.8
machine 9.8
work 9.7
interior 9.7
outdoors 9.7
rural 9.7
home 9.6
wheel 9.4
man 9.4
street 9.2
window 9.2
vacation 9
wooden 8.8
machinery 8.8
room 8.7
grass 8.7
barber chair 8.7
construction 8.6
tree 8.5
outdoor 8.4
sky 8.3
vintage 8.3
farm 8
wheels 7.8
male 7.8
truck 7.7
dirt 7.6
city 7.5
tourism 7.4
town 7.4
historic 7.3
tourist 7.2
metal 7.2
color 7.2
bench 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

land vehicle 97.1
wheel 97
vehicle 95.5
outdoor 93.9
old 79.8
truck 72.5
cart 71.3
drawn 65.9
tire 62.5
auto part 59.6
carriage 57.7
several 10.5

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 97.9%
Sad 28.4%
Happy 19.9%
Calm 17.9%
Confused 14.1%
Angry 6.9%
Surprised 6%
Disgusted 5.1%
Fear 1.8%

AWS Rekognition

Age 24-34
Gender Male, 99.4%
Sad 72.4%
Calm 22.9%
Fear 1%
Confused 0.9%
Surprised 0.8%
Disgusted 0.7%
Angry 0.7%
Happy 0.6%

AWS Rekognition

Age 19-27
Gender Female, 50.3%
Calm 49.7%
Sad 26.8%
Fear 7.8%
Happy 6.7%
Confused 5.3%
Angry 1.5%
Disgusted 1.1%
Surprised 1%

AWS Rekognition

Age 25-35
Gender Female, 79.2%
Calm 98.2%
Happy 0.6%
Sad 0.3%
Disgusted 0.3%
Fear 0.3%
Angry 0.2%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 16-22
Gender Male, 76.4%
Calm 99.1%
Happy 0.6%
Sad 0.1%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
Surprised 0%
Angry 0%

AWS Rekognition

Age 28-38
Gender Female, 93.1%
Calm 94.6%
Fear 2.7%
Sad 0.8%
Happy 0.8%
Angry 0.5%
Disgusted 0.3%
Confused 0.2%
Surprised 0.2%

AWS Rekognition

Age 23-33
Gender Male, 74.2%
Calm 98.3%
Sad 0.6%
Happy 0.3%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%
Surprised 0.1%
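
Each face block above mirrors the fields of a Rekognition `DetectFaces` response with all attributes enabled: an age range, a gender estimate with confidence, and emotion scores listed from most to least confident. A sketch of rendering one such face record into this layout (the sample record uses illustrative values, not the real output for this image):

```python
# Render one Rekognition-style FaceDetail record into the block layout above.
# The sample record below is illustrative, not the actual output for this image.
def format_face(face):
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {round(face['Gender']['Confidence'], 1)}%",
    ]
    # Emotions are sorted so the strongest signal appears first
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emo['Type'].capitalize()} {round(emo['Confidence'], 1)}%")
    return lines

sample = {
    "AgeRange": {"Low": 23, "High": 33},
    "Gender": {"Value": "Male", "Confidence": 97.9},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 19.9},
        {"Type": "SAD", "Confidence": 28.4},
    ],
}
print("\n".join(format_face(sample)))
```

Note that the per-face emotion percentages are independent scores, so a face can read as mostly "Sad" while still carrying nontrivial "Happy" or "Calm" confidence, as in the first block above.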

Feature analysis

Amazon

Person 98%
Wheel 94.5%
Bench 81%
Truck 64.3%