Human Generated Data

Title

Untitled (men fixing farm damaged in tornado)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16694

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Shelter 98.5
Outdoors 98.5
Nature 98.5
Building 98.5
Countryside 98.5
Rural 98.5
Person 98
Person 97
Stage 89.4
Clothing 73.8
Apparel 73.8
Crowd 71.2
Housing 68.9
Face 66.9
Photography 63.5
Photo 63.5
Musician 60.1
Musical Instrument 60.1
Hut 58.1

Clarifai
created on 2023-10-29

people 99.1
monochrome 96.3
music 93.1
adult 90.9
street 90.6
vehicle 88.8
art 86.2
man 86.2
flame 84.6
many 82.7
the press 82
group together 81.8
war 80.1
woman 80.1
group 79.7
dust 78.5
family 75.6
administration 75.4
transportation system 75.1
aircraft 74.7

Imagga
created on 2022-02-26

shopping cart 40.6
wheeled vehicle 33.2
handcart 31.8
sky 21.9
building 16.6
container 15.4
city 15
architecture 14.6
vehicle 14.2
night 14.2
conveyance 13.9
structure 13.8
industry 13.7
equipment 12.9
construction 12.8
car 12.8
clouds 12.7
freight car 12.7
technology 12.6
wreckage 12.4
steel 12.4
travel 12
landscape 11.9
business 11.5
metal 11.3
part 11.1
light 10.7
water 10.7
urban 10.5
power 10.1
transportation 9.9
black 9.6
cloud 9.5
winter 9.4
dark 9.2
industrial 9.1
scene 8.7
device 8.4
old 8.4
mountains 8.3
transport 8.2
horizon 8.1
factory 7.9
roof 7.9
cockpit 7.8
cold 7.7
tree 7.7
dusk 7.6
skyline 7.6
house 7.5
park 7.5
ocean 7.5
silhouette 7.4
machine 7.4
work 7.2
sea 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.9
black and white 87.2
drawing 75.6
several 14.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Male, 99.3%
Calm 96.5%
Confused 2.1%
Surprised 0.3%
Happy 0.3%
Disgusted 0.2%
Sad 0.2%
Fear 0.2%
Angry 0.1%

Feature analysis

Amazon

Person
Person 99.7%

Captions

Microsoft
created on 2022-02-26

calendar 26.4%

Text analysis

Amazon

YT77AB
YT77AB can
can