Human Generated Data

Title

Untitled (four men standing behind surveyor's tripod, girders and steel frames in background, standing on partially constructed bridge(?))

Date

20th century

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 4.2002.22228

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.7%
Human 99.7%
Person 99.6%
Person 99.1%
Person 98.2%
Clothing 96.1%
Apparel 96.1%
Wood 92.6%
Handrail 81.1%
Banister 81.1%
Plywood 79%
Helmet 74.6%
Tripod 70.8%
Building 65.8%
Construction 65.1%
Hardhat 64%
Photography 61.3%
Photo 61.3%
Portrait 61.2%
Face 61.2%
Undershirt 59.5%
Railing 58.5%
Worker 55.6%

Clarifai
created on 2023-10-22

people 99.8%
group together 98.6%
adult 97.6%
group 97.3%
man 95.3%
outfit 91.9%
two 91.9%
wear 91.4%
many 90.7%
three 89.9%
actor 89.9%
monochrome 88.3%
several 86.3%
woman 85.6%
uniform 84.3%
construction worker 79.6%
vehicle 78.7%
music 78%
watercraft 77.4%
aircraft 76.1%

Imagga
created on 2022-03-11

engineer 28.4%
musical instrument 27.8%
blackboard 21.3%
percussion instrument 21%
man 18.8%
chair 16.3%
male 16.3%
marimba 14.6%
people 13.9%
device 13%
person 12.5%
stringed instrument 12.3%
business 12.1%
building 12.1%
protection 11.8%
industrial 10.9%
sky 10.8%
technology 10.4%
table 10.4%
adult 9.8%
destruction 9.8%
room 9.4%
construction 9.4%
dark 9.2%
old 9%
dirty 9%
stage 9%
outdoors 8.9%
night 8.9%
group 8.9%
sun 8.8%
architecture 8.6%
two 8.5%
seat 8.4%
folding chair 8.4%
silhouette 8.3%
life 8.2%
danger 8.2%
suit 8.1%
center 8%
water 8%
interior 8%
lifestyle 7.9%
businessman 7.9%
work 7.8%
glass 7.8%
disaster 7.8%
nuclear 7.8%
gas 7.7%
industry 7.7%
outdoor 7.6%
light 7.5%
environment 7.4%
safety 7.4%
vibraphone 7.3%
happiness 7%
modern 7%

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 95.2%
black and white 95%
person 90%
man 87.8%
outdoor 86.6%
clothing 82.3%
black 75.4%

Color Analysis

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 100%
Calm 96.3%
Sad 1.3%
Happy 0.7%
Confused 0.6%
Disgusted 0.4%
Surprised 0.3%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 63.5%
Disgusted 13.1%
Sad 9%
Happy 4.8%
Confused 4.1%
Angry 2.5%
Fear 1.9%
Surprised 1.1%

AWS Rekognition

Age 41-49
Gender Male, 64.7%
Calm 92%
Sad 2.9%
Confused 1.3%
Fear 1%
Surprised 0.8%
Happy 0.8%
Angry 0.7%
Disgusted 0.4%

AWS Rekognition

Age 52-60
Gender Male, 99.5%
Calm 79.8%
Fear 4.8%
Disgusted 4.3%
Confused 3.7%
Sad 2.4%
Angry 2.4%
Surprised 1.5%
Happy 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.7%
Person 99.6%
Person 99.1%
Person 98.2%

Categories

Text analysis

Google

PCX
PCX