Human Generated Data

Title

Untitled (spectators watching house fire)

Date

May 14, 1950

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18049

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.8
Human 99.8
Person 99.7
Person 99.6
Person 99.5
Person 99.5
Person 99.4
Person 98.9
Pedestrian 98.2
Street 98
Urban 98
Town 98
City 98
Road 98
Building 98
Clothing 93.9
Apparel 93.9
Path 93.8
Asphalt 91.3
Tarmac 91.3
Walkway 84
Overcoat 79.3
Coat 79.3
Alley 77.6
Alleyway 77.6
Suit 76.5
Nature 73.6
People 68.7
Sidewalk 66.7
Pavement 66.7
Outdoors 64.3
Silhouette 58.4
Walking 57.5

Imagga
created on 2022-03-04

tunnel 42.8
passageway 29.8
passage 25.1
fountain 20.3
dark 20
structure 19.1
sidewalk 18.5
night 17.8
old 17.4
way 17.2
light 16
building 15.8
city 15.8
man 15.5
tree 15.4
travel 14.8
street 14.7
stone 14.3
architecture 14.2
people 13.9
wall 12.8
outdoor 12.2
holiday 12.2
water 12
park 11.6
tourism 10.7
adult 10.6
history 9.8
rock 9.6
tourist 9.3
historic 9.2
urban 8.7
scene 8.7
black 8.5
walking 8.5
winter 8.5
monument 8.4
outdoors 8.4
town 8.3
sky 8.3
silhouette 8.3
landscape 8.2
lamp 8.2
road 8.1
shadow 8.1
sun 8.1
person 7.9
darkness 7.8
season 7.8
entrance 7.7
snow 7.5
lights 7.4
vacation 7.4
wet 7.2
river 7.1
male 7.1
scenic 7

Google
created on 2022-03-04

Black 89.6
Black-and-white 86
Tree 84.7
Style 83.8
Line 82.5
Adaptation 79.5
Monochrome photography 77.4
Monochrome 75.4
Plant 73.8
Road surface 71.4
Font 70.5
Event 69.3
Road 68.2
Vintage clothing 67.2
Darkness 66.5
Art 66.3
City 63.7
Asphalt 63.3
Room 62.7
Pedestrian 62.6

Microsoft
created on 2022-03-04

outdoor 98
text 95.6
black and white 95
person 84.4
street 83.5
clothing 79.1
way 72.3
monochrome 71.9
tree 53.6
sidewalk 28.4

Face analysis

Google

Google Vision (Face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision (Face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision (Face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (Face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision (Face 5)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (Face 6)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people walking down a street 80.6%
a group of people walking in front of a building 79%
a group of people walking down the street 78.9%

Text analysis

Amazon

SCI

Google

SEI
SEI