Human Generated Data

Title

Untitled (group standing outside building)

Date

1944

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2075

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Nature 99.8
Blizzard 99.6
Outdoors 99.6
Storm 99.6
Snow 99.6
Winter 99.6
Person 99.2
Human 99.2
Wheel 98.7
Machine 98.7
Person 98.7
Person 98.2
Person 97.9
Person 97
Person 96.4
Person 94.2
Person 93.6
Car 89.9
Transportation 89.9
Vehicle 89.9
Automobile 89.9
Weather 75.9
Person 61.5
Neighborhood 57.5
Urban 57.5
Building 57.5
Ice 57.5

Clarifai
created on 2023-10-15

house 99.3
building 98.8
street 98.3
monochrome 97.9
architecture 97.4
town 97.3
people 96
winter 95.7
vintage 94
family 93.4
snow 93.2
cold 91.8
old 91.5
wood 90.1
home 90
vehicle 90
black and white 89.4
roof 89.3
door 88.2
city 88.2

Imagga
created on 2021-12-14

sketch 71.7
drawing 57.1
snow 43.1
representation 40.8
architecture 38.1
building 31.8
city 26.6
house 25.4
structure 22.3
winter 22.1
sky 20.4
weather 20.4
construction 19.7
old 19.5
urban 17.5
cold 15.5
landscape 13.4
tree 13.1
town 13
garage 12.7
history 12.5
wall 12
exterior 12
travel 12
window 11.9
trees 10.7
scene 10.4
stone 10.2
facade 10
vintage 9.9
river 9.8
new 9.7
home 9.6
fence 9.5
roof 9.5
negative 9.4
street 9.2
landmark 9
ice 8.8
film 8.8
light 8.7
ancient 8.7
culture 8.6
business 8.5
design 8.4
modern 8.4
outdoor 8.4
frame 8.3
tower 8.1
night 8
blueprint 7.8
season 7.8
district 7.8
industry 7.7
grunge 7.7
plan 7.6
outdoors 7.5
church 7.4
industrial 7.3
day 7.1

Google
created on 2021-12-14

Building 94.9
Window 94
House 85.5
Door 83.8
Art 80.7
Tints and shades 75.9
Rectangle 75
Siding 74.9
Monochrome 74.9
Cottage 74.4
Vehicle 74.3
Facade 72.8
Tree 71.6
Car 68.9
Wheel 68.8
Painting 68.4
Illustration 67.6
Motor vehicle 67.2
Room 66.7
Drawing 66.2

Microsoft
created on 2021-12-14

outdoor 99.5
text 96.1
road 96.1
vehicle 85
house 76.7
black 76
land vehicle 69.8
white 69.6
black and white 61.1
car 55.8
old 45.9

Face analysis

Amazon

AWS Rekognition

Age 18-30
Gender Male, 89.4%
Happy 49.2%
Calm 23.5%
Surprised 12.4%
Sad 7.9%
Confused 4.1%
Angry 1.5%
Fear 0.7%
Disgusted 0.7%

AWS Rekognition

Age 36-54
Gender Male, 64.1%
Calm 48.8%
Happy 42.4%
Sad 4.6%
Disgusted 2.3%
Angry 0.8%
Fear 0.5%
Surprised 0.3%
Confused 0.3%

AWS Rekognition

Age 49-67
Gender Male, 66.7%
Calm 82.7%
Sad 7.6%
Happy 5.1%
Angry 2.3%
Fear 1%
Confused 0.7%
Surprised 0.3%
Disgusted 0.2%

AWS Rekognition

Age 8-18
Gender Male, 60.3%
Calm 51.3%
Sad 28.7%
Confused 12.1%
Happy 4.2%
Angry 1.8%
Surprised 1.2%
Disgusted 0.5%
Fear 0.3%

Feature analysis

Amazon

Person 99.2%
Wheel 98.7%
Car 89.9%

Categories

Imagga

paintings art 98.9%

Text analysis

Amazon

MJI3
MJI3 YT77A2 АЗАА
YT77A2
АЗАА