Human Generated Data

Title

Untitled (two photographs: men playing cards at table; cars parked in lot)

Date

c. 1945, printed later

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6786

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.6
Person 99.6
Person 99.6
Poster 99.5
Advertisement 99.5
Collage 99.5
Person 99.4
Person 98.9
Person 98.7
Person 98.4
Person 96.9
Person 95.1
Nature 91.3
Person 89.5
Outdoors 89
Building 85.6
Person 84.3
Countryside 82.2
Person 80.9
Hut 75
Rural 75
Shack 73.6
Person 65.8
Person 65.6
Dugout 59.3
Prison 59.3
People 58.6
Person 46.9

Clarifai
created on 2019-11-16

people 99.6
group 99.3
many 98.7
group together 98.4
man 97.2
adult 97.2
woman 94
room 91.6
child 89
crowd 88.6
several 88.3
movie 88.1
furniture 85
war 82.7
wear 82.5
television 82.5
indoors 81.6
monochrome 81
education 80.7
audience 79.9

Imagga
created on 2019-11-16

barbershop 25.5
shop 22.1
window 21.1
old 19.5
building 18.9
city 18.3
structure 17.3
vehicle 16.9
billboard 16.8
wheeled vehicle 16.8
mercantile establishment 16.8
conveyance 16.7
street 16.6
snow 15.6
architecture 15.6
transportation 15.2
travel 14.8
tramway 14
black 13.8
winter 13.6
car 13.3
train 12.5
signboard 12
transport 11.9
streetcar 11.4
urban 11.4
door 11.3
cold 11.2
place of business 11.2
wall 11.1
house 10.9
public 10.7
people 10
passenger car 9.6
silhouette 9.1
road 9
station 8.8
man 8.7
lamp 8.7
light 8.7
glass 8.6
dirty 8.1
scene 7.8
windows 7.7
sky 7.6
decoration 7.6
sign 7.5
passenger 7.4
equipment 7.4
track 7.3
tourist 7.2
history 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 95.5
clothing 94.9
person 87.3
indoor 85.6
man 74.6
gallery 67.7
people 62.8
old 51
room 44.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 50-68
Gender Male, 50.4%
Sad 50%
Angry 49.5%
Calm 49.8%
Happy 49.6%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Fear 49.5%

AWS Rekognition

Age 50-68
Gender Male, 50.5%
Calm 49.8%
Confused 49.5%
Happy 49.9%
Disgusted 49.5%
Surprised 49.5%
Sad 49.6%
Fear 49.5%
Angry 49.6%

AWS Rekognition

Age 20-32
Gender Male, 50.1%
Sad 50%
Surprised 49.5%
Angry 49.5%
Confused 49.5%
Fear 49.9%
Happy 49.5%
Calm 49.6%
Disgusted 49.5%

AWS Rekognition

Age 12-22
Gender Male, 50.2%
Sad 50.1%
Fear 49.5%
Disgusted 49.5%
Angry 49.5%
Happy 49.6%
Confused 49.5%
Calm 49.8%
Surprised 49.5%

AWS Rekognition

Age 26-40
Gender Female, 50.1%
Disgusted 49.5%
Sad 50.3%
Confused 49.5%
Happy 49.5%
Fear 49.5%
Surprised 49.5%
Calm 49.6%
Angry 49.5%

AWS Rekognition

Age 37-55
Gender Male, 50.1%
Disgusted 49.5%
Fear 49.6%
Calm 49.7%
Surprised 49.5%
Angry 49.6%
Confused 49.5%
Happy 49.9%
Sad 49.6%

AWS Rekognition

Age 20-32
Gender Male, 50.3%
Confused 49.5%
Calm 49.5%
Surprised 49.5%
Fear 49.5%
Happy 49.5%
Angry 49.5%
Sad 50.4%
Disgusted 49.5%

AWS Rekognition

Age 32-48
Gender Male, 50.5%
Happy 49.5%
Disgusted 49.5%
Sad 50.2%
Angry 49.5%
Surprised 49.5%
Fear 49.5%
Calm 49.8%
Confused 49.5%

Feature analysis

Amazon

Person
Person 99.6%
Person 99.6%
Person 99.4%
Person 98.9%
Person 98.7%
Person 98.4%
Person 96.9%
Person 95.1%
Person 89.5%
Person 84.3%
Person 80.9%
Person 65.8%
Person 65.6%
Person 46.9%

Categories

Imagga

paintings art 52.9%
food drinks 44.9%

Captions

Google Gemini

Created by gemini-2.0-flash on 2025-05-11

The image is a black and white composite containing four separate images.

In the top left image, a group of men and one woman are gathered around a table in what appears to be a communal building. Many are wearing light-colored shirts and hats. One man has the word "Superior" printed on the back of his shirt.

The top right part of the composite is blank and just shows the background.

In the bottom left image, there is a long, one-story building with numerous parked cars in front of it. The surrounding landscape is grassy and open.

The bottom right image shows a similar group of people around a table as in the top left image, but with a slightly different composition. The same man with "Superior" on his shirt is visible, as are many of the same individuals. The setting appears to be the same communal building.

Created by gemini-2.0-flash-lite on 2025-05-11

The image shows a collection of four black and white photographs. The photographs seem to be of people, likely from the mid-20th century, possibly related to a business or organization.

  • Top left: Shows a group of men standing and sitting around a table. Most of the men are wearing hats and white shirts. The word "Superior" is visible on one of the men's shirts.
  • Top right: Shows a group of men, most of them wearing hats and white shirts, sitting around a table.
  • Bottom left: Shows a building with parked cars in front, likely in a rural area.
  • Bottom right: Shows another view of the men at a table.

The overall composition suggests a photographic record or a series of images from a specific event or time, possibly a gathering or work-related activity.

Text analysis

Amazon

Smpscige
Srpeion

Google

Stupecior Spscier
Stupecior
Spscier