Human Generated Data

Title

Street Restaurant Vendor

Date

c. 1870

People

Artist: William Saunders, British, 1832–1892

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.64

Machine Generated Data

Tags

Amazon
created on 2023-01-27

Worker 99.6
Person 99.2
Man 99.2
Adult 99.2
Male 99.2
Person 99.1
Man 99.1
Adult 99.1
Male 99.1
Person 99
Man 99
Adult 99
Male 99
Hat 96.1
Clothing 96.1
Coat 94.8
Factory 90.8
Building 90.8
Architecture 90.8
Outdoors 84.2
Shoe 83.1
Footwear 83.1
Photography 82.5
Nature 66.4
Machine 66.1
Manufacturing 65.8
Lamp 57.9
Hat 57.5
Portrait 56.5
Head 56.5
Face 56.5
Gardener 56.2
Gardening 56.2
Garden 56.2
Hospital 56
People 55.9
Cleaning 55.7
Carpenter 55.2
Pump 55.1
Hardhat 55
Helmet 55
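
These label/confidence pairs follow the shape of output returned by AWS Rekognition's DetectLabels API. A minimal Python sketch of such a call, assuming boto3 credentials are already configured; the S3 bucket and object key are hypothetical placeholders:

import boto3

# Minimal sketch: fetch image labels with Rekognition DetectLabels.
# Bucket and key are hypothetical placeholders, not the museum's storage.
rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "street-restaurant-vendor.jpg"}},
    MinConfidence=55,  # the lowest score listed above is about 55
)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")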

Clarifai
created on 2023-10-13

people 99.9
adult 97.9
man 95.8
art 95.3
portrait 94.7
street 93.5
furniture 91.4
painting 90.9
one 90.4
two 89.1
child 88.6
room 87.9
wear 87.1
war 83.4
group 83.1
woman 83.1
administration 80.2
monochrome 79.2
family 78.5
analogue 76.2
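
Clarifai's concept scores (shown here on a 0-100 scale) follow the shape of its v2 predict endpoint. A minimal sketch using plain HTTP, assuming Clarifai's general image-recognition model; the API key and image URL are placeholders:

import requests

# Minimal sketch: predict concepts with Clarifai's v2 REST API.
# Key, model id, and image URL are assumptions/placeholders.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")  # value is 0-1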

Imagga
created on 2023-01-27

shovel 25.1
man 18.1
hand tool 17.1
people 16.7
tool 16.7
device 15.7
washboard 14.9
musical instrument 14.6
old 13.9
crutch 13.7
male 12.8
person 11.9
dark 11.7
sunset 11.7
chair 11.7
stick 11.6
silhouette 11.6
outdoor 11.5
outdoors 11.2
adult 11.1
staff 10.6
work 10.1
dirty 9.9
travel 9.1
protection 9.1
wheelchair 8.8
radio 8.8
water 8.7
beach 8.4
summer 8.4
banjo 8.2
danger 8.2
stringed instrument 8.1
sun 8
kin 8
rural 7.9
country 7.9
seat 7.8
men 7.7
winter 7.7
sky 7.6
sport 7.4
light 7.3
brass 7.3
television 7.3
black 7.2
religion 7.2
farm 7.1
handcart 7.1
working 7.1
agriculture 7
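
Imagga's tag/confidence pairs match the shape of its v2 tagging endpoint. A minimal sketch; the API credentials and image URL are placeholders:

import requests

# Minimal sketch: tag an image with Imagga's v2 API (HTTP Basic auth).
# Credentials and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
)
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")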

Google
created on 2023-01-27

Microsoft
created on 2023-01-27

person 98.3
text 98.3
clothing 97.4
ground 96.3
man 95.8
outdoor 89.9
old 80.8
black and white 68.7
musical instrument 51.2
vintage 35.3
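
Microsoft's tags follow the shape of the Azure Computer Vision analyze endpoint. A minimal sketch, assuming API version v3.2; the resource endpoint, subscription key, and image URL are placeholders:

import requests

# Minimal sketch: request image tags from Azure Computer Vision v3.2.
# Endpoint, key, and image URL are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
    json={"url": "https://example.org/image.jpg"},
)
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")  # confidence is 0-1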

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.6%
Disgusted 0.1%
Angry 0.1%
Happy 0%

AWS Rekognition

Age 24-34
Gender Male, 99.1%
Calm 89.9%
Sad 8.1%
Surprised 6.3%
Fear 5.9%
Disgusted 0.1%
Happy 0.1%
Angry 0%
Confused 0%

AWS Rekognition

Age 21-29
Gender Male, 86.2%
Sad 99.8%
Calm 28.5%
Surprised 6.3%
Fear 5.9%
Confused 0.3%
Happy 0.3%
Disgusted 0.3%
Angry 0.1%

AWS Rekognition

Age 18-26
Gender Male, 73.1%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 2.1%
Confused 0.1%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
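
Each block above (age range, gender, emotions) matches the per-face output of AWS Rekognition's DetectFaces API with full attributes, one block per detected face. A minimal sketch, again with a hypothetical S3 location:

import boto3

# Minimal sketch: per-face age/gender/emotion estimates via DetectFaces.
# Bucket and key are hypothetical placeholders.
rekognition = boto3.client("rekognition")
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "street-restaurant-vendor.jpg"}},
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)
for face in response["FaceDetails"]:
    print(f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")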

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
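
The likelihood ratings above follow the face-annotation fields of the Google Cloud Vision API. A minimal sketch, assuming the google-cloud-vision 2.x Python client; the image URI is a placeholder:

from google.cloud import vision

# Minimal sketch: face detection with Google Cloud Vision; each annotation
# carries likelihood enums like those listed above. Image URI is a placeholder.
client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/image.jpg"))
response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)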

Feature analysis

Amazon

Person 99.2%
Man 99.2%
Adult 99.2%
Male 99.2%
Hat 96.1%
Shoe 83.1%
Lamp 57.9%

Text analysis

Google

}