Human Generated Data

Title

Untitled (New York City)

Date

1929

People

Artist: Paul Grotz, American, 1902 - 1990

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Fund for the Acquisition of Photographs, P1998.49

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 98.7
Clothing 97.9
Apparel 97.9
Helmet 87.7
Handrail 85.2
Banister 85.2
Railing 74.3
Hat 67.6
Porch 66.1
Wood 62.9
Overcoat 62.2
Coat 62.2
Vehicle 58.7
Transportation 58.7
Cap 58.2
Spoke 55.4
Machine 55.4

Clarifai
created on 2023-10-25

people 99.9
adult 98.7
two 97.1
man 96.4
one 93.9
street 93.6
group 93.1
group together 92.7
woman 91.8
administration 90.6
veil 89.4
monochrome 88.2
child 87.6
three 87.3
leader 83.6
portrait 83.4
wear 82.6
furniture 82.5
boy 81.9
vehicle 81

Imagga
created on 2022-01-08

man 41
worker 31.9
male 31.2
person 30.8
work 29
engineer 25.7
people 24
industry 23.9
working 23.8
job 23
adult 21.2
men 20.6
helmet 20.2
industrial 18.1
equipment 17.7
occupation 17.4
safety 16.6
professional 15.4
black 14.4
business 14
construction 12.8
businessman 12.3
protection 11.8
labor 11.7
uniform 11.6
lifestyle 11.6
office 11.4
hat 11.3
building 11.3
portrait 11
machine 10.9
factory 10.8
indoors 10.5
repair 10.5
computer 10.5
one 10.4
device 10.1
happy 10
city 10
hardhat 9.8
profession 9.6
clothing 9.3
laptop 9.2
looking 8.8
sax 8.7
women 8.7
sitting 8.6
foreman 8.1
light 8.1
suit 8.1
success 8
active 8
urban 7.9
winter 7.7
hand 7.6
seller 7.5
site 7.5
businesswoman 7.3
musical instrument 7.3
forklift 7.3
tool 7.2
home 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 99.1
text 97
clothing 94.8
person 90.1
man 85.9
black and white 62.5

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 96.2%
Calm 97%
Surprised 1.9%
Fear 0.5%
Angry 0.2%
Sad 0.2%
Disgusted 0.1%
Happy 0.1%
Confused 0%

AWS Rekognition

Age 27-37
Gender Male, 97.9%
Calm 99.2%
Sad 0.2%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Helmet 87.7%

Categories

Imagga

paintings art 85.7%
people portraits 8.9%
food drinks 4.9%

Text analysis

Amazon

'29
Reccl get '29
get
Reccl

Google

Recel Gob,
Recel
Gob,