Human Generated Data

Title

Untitled (street scene with workmen, Africa)

Date

1910s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3179

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2022-01-08

Person 98.4
Human 98.4
Person 98.2
Person 98
Wheel 97.8
Machine 97.8
Person 96.2
Wheel 96
Wheel 95.9
Wheel 95.3
Wheel 95.3
Automobile 92.9
Vehicle 92.9
Transportation 92.9
Wheel 92
Person 90.5
Person 88.7
Person 87.8
Bench 87.1
Furniture 87.1
Person 83.8
Building 81.5
Truck 81.3
Model T 79.1
Antique Car 79.1
Person 77.2
Tire 75.6
Factory 71.1
Car 67.6
Workshop 65.1
Spoke 63.8
Wood 63
Axle 57.6
Car Wheel 57.1
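
The Amazon tags above are the kind of name-plus-confidence output a label-detection call returns. Below is a minimal sketch using the AWS Rekognition detect_labels API via boto3; the filename photo.jpg, the label cap, and the confidence threshold are placeholder assumptions, not part of the museum record.

# Sketch: label detection with Amazon Rekognition (boto3).
# Assumes configured AWS credentials; "photo.jpg" is a hypothetical local copy of the image.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50,
    )

# Each label carries a name and a confidence in percent, matching rows such as "Person 98.4".
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))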

Clarifai
created on 2023-10-25

people 99.9
vehicle 99.1
group together 99.1
transportation system 98.7
adult 98.1
street 97.7
many 92.6
group 92.3
man 91.6
monochrome 90.9
several 89.3
war 88.6
truck 88.2
car 87
soldier 85.4
cart 82.4
military vehicle 81.9
two 81.9
driver 80.9
wagon 79.6

Imagga
created on 2022-01-08

barbershop 100
shop 100
mercantile establishment 100
place of business 68.7
establishment 34.3
old 23.7
transportation 17
architecture 16.4
building 15.2
travel 14.1
street 13.8
chair 13.5
city 13.3
vehicle 13.3
transport 12.8
outdoors 12.7
house 12.5
park 12.3
wheel 12.3
outdoor 10.7
horse 10.4
industry 10.2
machine 10
road 9.9
vintage 9.9
furniture 9.7
urban 9.6
antique 9.5
town 9.3
stone 9.3
carriage 9.2
industrial 9.1
cart 9
vacation 9
wheels 8.8
man 8.7
dirt 8.6
seat 8.6
brick 8.5
wood 8.3
sky 8.3
work 8
home 8
bench 8
machinery 7.8
outside 7.7
drive 7.6
landscape 7.4
wheelchair 7.4
male 7.1
wooden 7
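
The Imagga tags follow the same name-plus-confidence pattern. A rough sketch against Imagga's public v2 tagging endpoint is shown below; API_KEY, API_SECRET, and IMAGE_URL are placeholders, and the response fields should be checked against Imagga's current documentation.

# Sketch: image tagging with the Imagga v2 REST API (placeholder credentials and URL).
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

# Tags come back with an English label and a 0-100 confidence,
# comparable to rows such as "barbershop 100" above.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))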

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

ground 96.5
land vehicle 96.1
wheel 96
outdoor 94.9
vehicle 93.9
old 90.5
cart 85.5
drawn 75.8
carriage 67.7
tire 60
pulling 32.6

Color Analysis

Face analysis


AWS Rekognition

Age 23-31
Gender Male, 98.5%
Happy 87.5%
Calm 2.8%
Sad 2.7%
Confused 1.9%
Surprised 1.7%
Disgusted 1.3%
Fear 1.1%
Angry 1%

AWS Rekognition

Age 23-31
Gender Male, 87.4%
Calm 79%
Sad 19.4%
Surprised 0.4%
Confused 0.3%
Disgusted 0.3%
Fear 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 16-24
Gender Male, 76.5%
Sad 37.1%
Fear 23.3%
Happy 17.3%
Calm 10.7%
Confused 4.4%
Angry 3.5%
Disgusted 2%
Surprised 1.7%

AWS Rekognition

Age 12-20
Gender Male, 60.2%
Calm 95.6%
Sad 2.1%
Angry 1.5%
Happy 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 11-19
Gender Male, 80.4%
Calm 84.9%
Sad 4.7%
Fear 3.5%
Disgusted 2.4%
Angry 1.8%
Confused 1.2%
Surprised 0.7%
Happy 0.7%

AWS Rekognition

Age 20-28
Gender Male, 61.6%
Happy 66.3%
Calm 17.6%
Sad 9%
Angry 2.6%
Confused 1.6%
Fear 1.1%
Surprised 1%
Disgusted 0.9%

AWS Rekognition

Age 25-35
Gender Male, 54%
Calm 94.8%
Sad 1.8%
Angry 0.9%
Confused 0.7%
Disgusted 0.6%
Happy 0.5%
Fear 0.4%
Surprised 0.3%
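
The per-face age ranges, gender estimates, and emotion scores above resemble the output of Amazon Rekognition's face detection. A minimal sketch, again assuming boto3 credentials and a hypothetical local file:

# Sketch: face analysis with Amazon Rekognition (boto3).
# Assumes configured AWS credentials; "photo.jpg" is a hypothetical local copy of the image.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

# Each face reports an age range, a gender guess with confidence,
# and per-emotion confidences in percent, as in the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print("Age", f'{age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print("Gender", f'{gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(emotion["Type"].capitalize(), f'{emotion["Confidence"]:.1f}%')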

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely
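
The Google Vision blocks report likelihood buckets rather than percentages. A sketch using the google-cloud-vision client library (version 2 or later), with photo.jpg again standing in for a local copy of the image:

# Sketch: face detection with the Google Cloud Vision client library (google-cloud-vision >= 2.x).
# "photo.jpg" is a hypothetical local copy of the image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
# which is why the rows above read "Very unlikely", "Possible", and so on.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)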

Feature analysis

Amazon

Person 98.4%
Wheel 97.8%
Bench 87.1%
Truck 81.3%
Car 67.6%