Human Generated Data

Title

Untitled (Bob-O the clown with circus performer Carmen Monton and small dog)

Date

1961

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11522

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence scores)

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Clothing 90.1
Apparel 90.1
Shoe 72.8
Footwear 72.8
Person 68.4
Outdoors 63.9
Face 63.2
Leisure Activities 60.9
Photography 60.1
Photo 60.1
Shorts 58.7
Pants 57.3

Clarifai
created on 2023-10-25

people 99.1
monochrome 96.7
wear 95.5
man 93.1
vehicle 92.7
aircraft 91.2
two 91.1
adult 91.1
group together 90.2
one 87.8
outfit 87.8
robot 87.7
three 86.7
transportation system 85.4
retro 83.2
recreation 80.9
military 80.7
war 80.1
movie 79.7
music 78.5

Imagga
created on 2022-01-15

device 20
metal 15.3
city 14.9
transportation 13.4
transport 12.8
vehicle 11.9
building 11.9
brass 11.1
support 11.1
architecture 10.9
gun 10.7
travel 10.5
modern 10.5
statue 10.2
man 10.1
wheel 10
industry 9.4
speed 9.1
cannon 9.1
old 9
sculpture 9
work 8.9
technology 8.9
wind instrument 8.8
urban 8.7
structure 8.7
art 8.6
outside 8.5
equipment 8.5
sport 8.4
line 8.4
sky 8.3
rope 8
cycle 7.8
male 7.8
weapon 7.8
color 7.8
turbine 7.8
car 7.7
auto 7.6
power 7.5
dark 7.5
monument 7.5
light 7.3
digital 7.3
machine 7.3
park 7.3
industrial 7.3
people 7.2
landmark 7.2
black 7.2
steel 7.2
spoke 7.1
adult 7.1

Microsoft
created on 2022-01-15

outdoor 97.6
text 97.6
black and white 86.8
ship 68.6

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Female, 95.7%
Calm 44.1%
Happy 43.5%
Sad 3.6%
Angry 3%
Surprised 1.7%
Disgusted 1.7%
Fear 1.4%
Confused 1%

Feature analysis

Amazon

Person 99.4%
Shoe 72.8%

Captions

Microsoft
created on 2022-01-15

a group of people riding on the back of a motorcycle 27%

Text analysis

Amazon

TNT
47252
MJI7--YT 47252
MJI7--YT
NOT

Google

MJI7--YT NOT TNT
MJI7--YT
NOT
TNT