Human Generated Data

Title

Untitled (workmen, Africa)

Date

1910s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3168

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.5
Person 99.5
Person 99.5
Person 92.7
Vehicle 92.4
Transportation 92.4
Person 90.7
Car 84.1
Automobile 84.1
Person 83.1
Person 82.1
Truck 79.1
Person 79.1
Wheel 77.7
Machine 77.7
Nature 74.7
Outdoors 66.4
Person 61.4
Train 61.1
Model T 56.9
Antique Car 56.9
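
The Amazon tags above have the shape of AWS Rekognition label-detection output: label names paired with confidence scores. A minimal sketch of how such tags could be produced with boto3, assuming a hypothetical image file and region (neither is given in the record):

```python
import boto3

# Hypothetical region; credentials come from the standard AWS config chain.
client = boto3.client("rekognition", region_name="us-east-1")

with open("workmen_africa.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# DetectLabels returns labels such as "Person", "Vehicle", "Wheel",
# each with a Confidence score, matching the tag/score pairs listed above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55.0,  # the list above bottoms out near 56.9
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeated entries such as the several "Person" scores above are
    # plausibly per-instance detections under each label's Instances field.
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}")
```
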

Clarifai
created on 2023-10-25

people 100
group together 99.1
transportation system 99
vehicle 98.9
adult 97.8
cavalry 97.6
two 97.2
wagon 95.3
group 95.1
driver 94.7
man 94.3
carriage 91.3
three 91.2
street 90.9
child 88.9
one 88.9
war 88.7
soldier 88.2
wear 87.2
home 86

Imagga
created on 2022-01-08

horse cart 41.4
cart 40.9
wheeled vehicle 34.7
wagon 32
vehicle 30.9
old 20.9
building 16.2
transportation 16.1
truck 15.4
transport 14.6
industry 14.5
machine 14.5
road 14.5
seat 14.3
chair 14.1
carriage 13.8
industrial 13.6
architecture 13.3
wheel 13.2
garbage truck 13.1
street 12.9
work 12.6
rural 12.3
wheelchair 12.2
man 12.1
barbershop 12
city 11.6
bench 11.6
tricycle 11.4
dirt 10.5
landscape 10.4
shop 10.1
tree 10
motor vehicle 9.9
outdoor 9.9
male 9.9
travel 9.9
farm 9.8
antique 9.5
car 9.5
grass 9.5
construction 9.4
house 9.2
tractor 8.8
machinery 8.8
wall 8.6
horse 8.5
outdoors 8.3
equipment 8.3
sky 8.3
history 8
conveyance 7.9
stretcher 7.9
urban 7.9
container 7.9
ancient 7.8
outside 7.7
war 7.7
litter 7.6
drive 7.6
wood 7.5
furniture 7.5
town 7.4
mercantile establishment 7.4
danger 7.3
trailer 7.1
working 7.1
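
Imagga exposes tagging as a REST service rather than an SDK. A sketch of how tags like these could be fetched, assuming the v2 /tags endpoint with HTTP basic auth; the credentials and image URL below are placeholders:

```python
import requests

# Placeholder credentials; Imagga uses HTTP basic auth with a key/secret pair.
API_KEY = "YOUR_API_KEY"
API_SECRET = "YOUR_API_SECRET"

# Assumed v2 tagging endpoint; the image is passed by URL.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/workmen_africa.jpg"},  # hypothetical URL
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# The result is a list of {confidence, tag} objects, e.g. "horse cart 41.4".
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```
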

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

building 99.4
ground 97.3
land vehicle 96.5
vehicle 95.2
person 86.3
wheel 85.8
car 79.8
transport 79.3
horse-drawn vehicle 78.1
old 71.4
cart 31.3
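
The Microsoft tags resemble output from the Azure Computer Vision tagging endpoint. A sketch, assuming the v3.2 REST API with a placeholder resource endpoint and key; Azure reports confidence on a 0-1 scale, so the scores above appear scaled by 100:

```python
import requests

# Placeholder endpoint and key for an Azure Computer Vision resource.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

with open("workmen_africa.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Assumed v3.2 tagging endpoint; binary image body, key in a header.
response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Scale the 0-1 confidence to match the percentages listed above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```
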

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 99.8%
Fear 55.6%
Calm 29.9%
Confused 10.1%
Sad 2.7%
Surprised 0.5%
Happy 0.5%
Disgusted 0.4%
Angry 0.3%

AWS Rekognition

Age 29-39
Gender Male, 99%
Disgusted 43.3%
Happy 17.7%
Sad 15.3%
Confused 15.1%
Calm 2.5%
Angry 2.4%
Surprised 2.1%
Fear 1.5%

AWS Rekognition

Age 23-31
Gender Female, 51.4%
Calm 93.2%
Happy 2.2%
Surprised 2.1%
Disgusted 0.8%
Confused 0.5%
Sad 0.5%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 18-26
Gender Male, 99.6%
Calm 87.9%
Happy 3.7%
Sad 3.5%
Angry 1.5%
Confused 1.1%
Surprised 0.9%
Disgusted 0.7%
Fear 0.5%
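
Each face block above (an age range, a gender estimate, and eight emotion scores) matches the shape of AWS Rekognition face-detection output. A minimal sketch, assuming the same hypothetical image bytes and region as in the earlier label-detection example:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # hypothetical region

with open("workmen_africa.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates
# in addition to the default bounding-box data.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 22, "High": 30}
    gender = face["Gender"]       # e.g. {"Value": "Male", "Confidence": 99.8}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion ordering is not guaranteed; sort by confidence, descending.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```
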

Feature analysis

Amazon

Person 99.5%
Truck 79.1%
Wheel 77.7%
Train 61.1%

Categories