Human Generated Data

Title

Untitled (street scene with workmen, Africa)

Date

1910s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3184

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Wheel 99.3
Machine 99.3
Wheel 99.1
Model T 98.7
Antique Car 98.7
Automobile 98.7
Car 98.7
Vehicle 98.7
Transportation 98.7
Person 98.3
Human 98.3
Tire 98
Person 96.4
Wheel 96.1
Person 96.1
Person 95
Person 94.4
Person 93.3
Person 91.9
Spoke 87.1
Person 86.8
Person 85.5
Person 79.5
Car Wheel 79
Person 71.9
Wheel 70.1
Person 69.2
Person 67.6
Alloy Wheel 57.5

Imagga
created on 2022-01-08

vehicle 72.5
half track 39.7
machine 38.2
wheeled vehicle 34.2
military vehicle 32.5
tracked vehicle 32.1
truck 31.5
transportation 28.7
seller 24
old 23.7
car 22.2
transport 21.9
wheel 20.8
thresher 19.3
machinery 18.5
wagon 17.4
industry 17.1
cart 16.8
industrial 16.3
conveyance 16.1
work 16
equipment 15.7
construction 15.4
farm machine 15.4
heavy 15.3
grass 15
tire 14.7
village 14.5
road 14.5
tractor 13.8
horse cart 13.7
auto 13.4
dirt 13.4
building 13
motor vehicle 12.9
wheels 12.7
farm 12.5
rural 12.3
dirty 11.7
driving 11.6
sky 11.5
yellow 11.3
antique 10.9
house 10.9
hay 10.6
rock 10.4
architecture 10.2
outdoor 9.9
travel 9.9
landscape 9.7
drive 9.5
site 9.4
device 9.3
iron 9.3
metal 8.9
dust 8.8
motor 8.7
military 8.7
engine 8.7
automobile 8.6
trailer 8.6
tree 8.6
environment 8.2
working 8
sand 7.9
abandoned 7.8
structure 7.8
cargo 7.8
track 7.7
garbage truck 7.7
power 7.6
field 7.5
outdoors 7.5
lorry 7.4
mine 7.1
job 7.1
wooden 7
agriculture 7

Google
created on 2022-01-08

Wheel 98.1
Tire 97.2
Vehicle 95.8
Motor vehicle 91.8
Building 87.5
Automotive tire 87.1
Mode of transport 85.4
Classic 75.4
Car 73.5
Tread 73.1
Automotive wheel system 67.9
Auto part 67.1
Sky 66.1
Fender 64.8
History 64.1
House 60.3
Suit 57.6
Tire care 56.9
Transport 52.1
Collectable 51.3

Microsoft
created on 2022-01-08

outdoor 99.3
land vehicle 98.6
wheel 98.5
vehicle 96.4
tire 86.4
old 74.5
cart 72.1
auto part 67.2
tractor 59.9

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 53.5%
Sad 37.4%
Angry 4.3%
Fear 2.1%
Confused 1.1%
Happy 0.6%
Disgusted 0.5%
Surprised 0.5%

AWS Rekognition

Age 25-35
Gender Male, 100%
Sad 52.4%
Disgusted 45.1%
Confused 0.9%
Calm 0.6%
Surprised 0.4%
Angry 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 41-49
Gender Male, 100%
Sad 92.2%
Calm 2.8%
Happy 2.1%
Confused 1.4%
Surprised 0.5%
Disgusted 0.3%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 23-31
Gender Male, 98%
Sad 83.4%
Disgusted 5.1%
Fear 4.8%
Calm 2.7%
Angry 1.7%
Confused 1%
Surprised 0.7%
Happy 0.6%

AWS Rekognition

Age 23-31
Gender Male, 100%
Sad 92%
Angry 2.8%
Calm 1.6%
Fear 1.6%
Confused 0.8%
Disgusted 0.6%
Surprised 0.4%
Happy 0.1%

AWS Rekognition

Age 24-34
Gender Male, 99.7%
Sad 28.9%
Angry 24.3%
Calm 22.4%
Fear 12%
Confused 4.6%
Disgusted 4%
Surprised 3%
Happy 0.8%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Happy 23%
Calm 18.5%
Sad 18%
Disgusted 17.1%
Fear 12%
Angry 5.2%
Surprised 4.3%
Confused 2%

AWS Rekognition

Age 16-22
Gender Male, 85.8%
Sad 34.4%
Confused 26.5%
Fear 25.1%
Calm 6.9%
Disgusted 3.2%
Surprised 1.8%
Angry 1.3%
Happy 0.8%

AWS Rekognition

Age 23-31
Gender Male, 98.3%
Calm 48.3%
Sad 17.3%
Angry 10%
Fear 9.3%
Disgusted 5.3%
Surprised 3.8%
Confused 3.4%
Happy 2.5%

AWS Rekognition

Age 36-44
Gender Male, 72.2%
Disgusted 83.5%
Sad 7.6%
Fear 2.1%
Confused 1.9%
Happy 1.3%
Surprised 1.2%
Angry 1.2%
Calm 1.1%

AWS Rekognition

Age 24-34
Gender Male, 99.4%
Fear 66%
Calm 21.4%
Surprised 3.8%
Sad 3.7%
Disgusted 2%
Confused 1.2%
Angry 1.2%
Happy 0.7%

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Calm 62.1%
Sad 24.5%
Surprised 3.9%
Happy 3.1%
Angry 2%
Confused 1.7%
Disgusted 1.7%
Fear 1%

AWS Rekognition

Age 20-28
Gender Female, 69.5%
Sad 39.1%
Calm 24.1%
Happy 21.4%
Angry 7.5%
Fear 4.6%
Disgusted 1.8%
Surprised 0.8%
Confused 0.5%

AWS Rekognition

Age 24-34
Gender Male, 52.3%
Calm 83.9%
Sad 11.9%
Fear 2%
Happy 0.7%
Confused 0.6%
Disgusted 0.4%
Angry 0.3%
Surprised 0.2%

AWS Rekognition

Age 16-24
Gender Male, 98.1%
Calm 44.9%
Sad 41.6%
Angry 5.5%
Happy 2.1%
Disgusted 2.1%
Fear 1.5%
Confused 1.2%
Surprised 1.1%

AWS Rekognition

Age 7-17
Gender Male, 69.7%
Sad 62.1%
Happy 29.1%
Fear 3.3%
Calm 2.5%
Angry 1.5%
Surprised 0.8%
Disgusted 0.4%
Confused 0.2%

AWS Rekognition

Age 19-27
Gender Female, 73.3%
Calm 84.6%
Sad 11.3%
Happy 1.9%
Confused 0.7%
Disgusted 0.6%
Fear 0.4%
Surprised 0.3%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Wheel 99.3%
Person 98.3%

Captions

Microsoft

a group of people in an old photo of a truck 91.9%
an old photo of a truck 91.8%
a group of people standing in front of a truck 90.2%