Human Generated Data

Title

Untitled (woman, possibly mother, with two children on steps by waterside)

Date

1962

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16996

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 100
Apparel 100
Person 98.2
Human 98.2
Person 96.3
Person 84
Fashion 81
Coat 79
Overcoat 78.1
Cloak 75.6
Hat 62.1
Toy 56.2

Imagga
created on 2022-02-26

man 24.9
mechanical device 22
person 20
protection 17.3
male 17
adult 16.8
water 16.7
mechanism 16.4
swing 16.3
safety 15.6
outdoors 15.4
industrial 15.4
people 15.1
smoke 14.9
dark 14.2
park 14
light 13.7
sprinkler 13
industry 12.8
mask 12.7
device 12.7
work 11.8
danger 11.8
structure 11.7
outdoor 11.5
tent 11.2
portrait 11
plaything 10.8
destruction 10.7
environment 10.7
travel 10.6
mountain tent 10.4
tree 10.2
river 9.8
forest 9.6
sky 9.6
sport 9.5
construction 9.4
job 8.8
accident 8.8
disaster 8.8
toxic 8.8
protective 8.8
nuclear 8.7
repair 8.6
canvas tent 8.3
lake 8.2
peaceful 8.2
fun 8.2
landscape 8.2
dirty 8.1
building 8
worker 8
stalker 7.9
business 7.9
radioactive 7.9
factory 7.8
radiation 7.8
standing 7.8
rock 7.8
season 7.8
labor 7.8
chemical 7.7
gas 7.7
shelter 7.7
adventure 7.6
fashion 7.5
enjoy 7.5
happy 7.5
clothing 7.4
occupation 7.3
metal 7.2
suit 7.2
activity 7.2
wet 7.2
smile 7.1
working 7.1
autumn 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 97.8
black and white 95.1
text 93.7
clothing 87.9
monochrome 84.8
person 80.7
tree 78.4

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.2%

Captions

Microsoft

a person that is standing in the rain 47.9%
a person is standing in the rain 47.8%
a person standing in the rain 47.7%

Text analysis

Amazon

ca