Human Generated Data

Title

Untitled (float in parade)

Date

1941

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2061

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 93.8
Human 93.8
Person 88.8
Person 83.3
Furniture 80
Person 79.7
Vehicle 78.2
Transportation 78.2
Person 74.1
Person 72.9
Chair 71.6
Face 70
Crowd 68.6
Person 66.6
People 64.9
Person 61.1
Sailor Suit 57.8
Theme Park 56.2
Amusement Park 56.2
Person 42.9

Clarifai
created on 2023-10-25

people 99.4
monochrome 97
many 96.7
adult 95.6
chair 95.5
man 95.2
group 90.8
group together 88.9
vehicle 88.2
crowd 86.6
no person 86.2
administration 85.6
furniture 82.6
seat 82.6
transportation system 81.7
watercraft 81.5
woman 79.3
street 76
war 75.5
child 75.1

Imagga
created on 2021-12-14

blackboard 28.4
home appliance 21.5
appliance 15.7
toaster 15.6
device 14.3
water 13.3
dishwasher 12.8
kitchen appliance 12.5
white goods 12.2
business 12.1
travel 12
grunge 11.9
snow 11
technology 10.4
pattern 10.2
finance 10.1
dirty 9.9
old 9.7
black 9.6
sky 9.6
cold 9.5
equipment 9.3
vintage 9.2
window 9.2
office 9
architecture 8.6
money 8.5
house 8.4
tourism 8.2
building 8.2
computer 8.1
machine 7.9
season 7.8
durables 7.7
winter 7.7
vessel 7.6
city 7.5
closeup 7.4
ice 7.4
bank 7.4
design 7.3
tourist 7.2
paint 7.2
currency 7.2
transportation 7.2

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 96.8
old 68.5
black and white 56.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-30
Gender Female, 55.4%
Calm 74.3%
Disgusted 11.2%
Sad 5.6%
Angry 5.6%
Happy 1.7%
Confused 0.7%
Fear 0.5%
Surprised 0.4%

AWS Rekognition

Age 42-60
Gender Male, 63.7%
Happy 93.8%
Calm 4.6%
Surprised 0.6%
Confused 0.3%
Angry 0.2%
Sad 0.2%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 13-25
Gender Female, 66.5%
Calm 50%
Happy 30.4%
Surprised 10.1%
Sad 3.1%
Fear 2.6%
Confused 1.7%
Angry 1.4%
Disgusted 0.7%

AWS Rekognition

Age 49-67
Gender Female, 53%
Happy 59.8%
Calm 21.8%
Angry 8.3%
Sad 5.1%
Fear 2.6%
Disgusted 0.9%
Confused 0.8%
Surprised 0.7%

AWS Rekognition

Age 56-74
Gender Male, 83%
Calm 75%
Happy 8.8%
Angry 5.4%
Sad 4.8%
Disgusted 2.9%
Confused 2.3%
Surprised 0.4%
Fear 0.4%

AWS Rekognition

Age 6-16
Gender Male, 56.1%
Calm 64.9%
Happy 16.3%
Sad 14.7%
Angry 1.1%
Fear 0.9%
Disgusted 0.8%
Surprised 0.7%
Confused 0.6%

Feature analysis

Amazon

Person 93.8%

Categories

Imagga

paintings art 99.2%

Text analysis

Amazon

NATIONAL
SUFFOLK
BANK
OF
CAFE
SUFFOLK FOR PEANUTS
FOR PEANUTS
M
VT7342
over
over للو
للو
VT7342 and
and
ozen

Google

SUFTOLK
SMEFOLK
EOR
NATIONAL SUFTOLK SMEFOLK EOR PEANUTS
NATIONAL
PEANUTS