Human Generated Data

Title

Untitled (Blackwell's Island Penitentiary, Welfare Island, New York City)

Date

May 1934-June 1934

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, by exchange, P2000.43

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2022-06-11

Audience 99.9
Crowd 99.9
Human 99.9
Person 99.5
Person 99
Person 98.9
Person 98.8
Person 98.5
Person 98.2
Person 98.1
Person 97.7
Person 97.6
Person 96.9
Person 96.7
Person 87.1
Person 85
People 83
Sunglasses 72.9
Accessories 72.9
Accessory 72.9
Person 67.9
Person 66.4
Speech 65
Person 63.1
Lecture 59.6
Text 58.5
Person 45.5

Clarifai
created on 2023-10-29

people 100
many 99.9
group 99.7
crowd 98.4
group together 98.3
adult 98.1
man 97.9
administration 96.4
woman 96.2
child 94.9
leader 94.3
war 92.2
spectator 92
military 89.6
audience 87.6
soldier 82.2
education 81
queue 79.2
recreation 78.3
school 75.1

Imagga
created on 2022-06-11

person 40.3
man 36.9
people 33.4
group 32.2
male 30.5
businessman 29.1
business 26.7
fan 26.1
men 23.2
spectator 21.9
meeting 21.7
team 21.5
office 21
classroom 20.4
follower 20.3
student 19
businesswoman 17.3
room 17.1
executive 17
colleagues 16.5
adult 16.5
corporate 16.3
businesspeople 15.2
table 14.7
happy 14.4
work 14.1
women 13.4
job 13.3
professional 13
lifestyle 13
education 13
teamwork 13
teacher 12.9
smile 12.8
speaker 12.4
smiling 12.3
coworkers 11.8
desk 11.3
40s 10.7
working 10.6
couple 10.4
sitting 10.3
school 10.2
laptop 10
discussion 9.7
portrait 9.7
indoors 9.7
together 9.6
boy 9.6
entrepreneur 9.5
articulator 9.3
manager 9.3
presentation 9.3
mature 9.3
communication 9.2
indoor 9.1
old 9.1
black 9
associates 8.8
life 8.8
crowd 8.6
staff 8.6
talking 8.5
senior 8.4
20s 8.2
successful 8.2
cheerful 8.1
suit 8.1
success 8
looking 8
boardroom 7.9
discussing 7.9
standing 7.8
conference 7.8
businessmen 7.8
mid adult 7.7
busy 7.7
class 7.7
30s 7.7
four 7.7
career 7.6
friends 7.5
secretary 7.4
camera 7.4
color 7.2
worker 7.1
day 7.1
happiness 7
travel 7
architecture 7
modern 7

Google
created on 2022-06-11

Photograph 94.2
Human 89.2
Black-and-white 84.5
Style 83.9
Crowd 82.8
Adaptation 79.4
Font 79.3
Monochrome photography 76.3
Monochrome 74.9
Snapshot 74.3
Event 73.4
Team 69.6
Tree 69.4
Crew 68.5
Room 68
History 67.5
Stock photography 64.5
Suit 64.1
Vintage clothing 64.1
Art 63.9

Microsoft
created on 2022-06-11

text 99
clothing 98.8
person 98.4
human face 95.7
man 93.6
woman 82.4
smile 79.8
people 73.7
crowd 63.9
group 63.1
black and white 50.3

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 100%
Calm 79.4%
Happy 9.5%
Surprised 7.6%
Fear 6.3%
Angry 2.9%
Sad 2.5%
Confused 2.5%
Disgusted 1%

AWS Rekognition

Age 48-56
Gender Male, 99.8%
Calm 99.2%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.4%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 1.1%
Confused 0.9%
Disgusted 0.7%
Angry 0.3%
Happy 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 73.1%
Sad 36.6%
Surprised 6.3%
Fear 5.9%
Confused 3.2%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Surprised 55.1%
Confused 52.6%
Calm 9.3%
Fear 5.9%
Sad 2.3%
Angry 2.1%
Disgusted 1.1%
Happy 0.3%

AWS Rekognition

Age 23-31
Gender Male, 97.8%
Calm 98.8%
Surprised 6.3%
Fear 5.9%
Sad 2.4%
Confused 0.1%
Disgusted 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Confused 81.5%
Calm 9.5%
Surprised 6.6%
Fear 6%
Sad 3.7%
Angry 3.6%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 30-40
Gender Male, 100%
Confused 75.9%
Calm 21.2%
Surprised 6.5%
Fear 5.9%
Sad 2.7%
Angry 0.4%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 18-26
Gender Male, 92.5%
Sad 81.1%
Calm 58.6%
Fear 6.5%
Surprised 6.3%
Confused 1%
Angry 0.9%
Happy 0.6%
Disgusted 0.2%

AWS Rekognition

Age 20-28
Gender Male, 99.7%
Calm 96.5%
Surprised 6.7%
Fear 5.9%
Sad 2.2%
Disgusted 1.2%
Happy 0.6%
Confused 0.4%
Angry 0.3%

AWS Rekognition

Age 37-45
Gender Male, 99.6%
Calm 87.1%
Surprised 6.5%
Fear 6.3%
Angry 5.8%
Confused 3.7%
Sad 2.6%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 1-7
Gender Female, 59.7%
Happy 84.3%
Calm 9.5%
Surprised 6.5%
Fear 6.2%
Sad 3.6%
Angry 0.7%
Disgusted 0.4%
Confused 0.2%

AWS Rekognition

Age 38-46
Gender Male, 99.1%
Calm 97.7%
Surprised 6.3%
Fear 5.9%
Sad 2.4%
Happy 0.5%
Angry 0.3%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 27-37
Gender Male, 100%
Sad 100%
Surprised 6.3%
Fear 5.9%
Angry 3.2%
Calm 0.6%
Confused 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 20-28
Gender Male, 59.5%
Confused 51.6%
Calm 33.8%
Surprised 7.5%
Fear 7%
Sad 4.5%
Angry 1.5%
Disgusted 1.3%
Happy 1.2%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Sad 97.3%
Confused 14.4%
Happy 13.4%
Calm 13.1%
Surprised 6.9%
Fear 6.1%
Disgusted 3.2%
Angry 2.2%

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Sad 89.2%
Calm 32%
Angry 10.3%
Surprised 7.5%
Fear 6.4%
Confused 5.9%
Happy 4.2%
Disgusted 2.4%

AWS Rekognition

Age 37-45
Gender Male, 99.8%
Fear 69.7%
Sad 64.6%
Surprised 7%
Calm 5.7%
Angry 4.3%
Happy 1.8%
Confused 1.5%
Disgusted 0.7%

AWS Rekognition

Age 19-27
Gender Female, 60.5%
Happy 85.2%
Surprised 7.7%
Fear 6.1%
Calm 5%
Disgusted 2.8%
Sad 2.6%
Angry 1.5%
Confused 0.9%

AWS Rekognition

Age 23-33
Gender Male, 67.4%
Calm 89.9%
Angry 8.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0.6%
Happy 0.2%
Confused 0.1%

AWS Rekognition

Age 21-29
Gender Male, 99.3%
Calm 32%
Disgusted 20.7%
Confused 18%
Happy 15.5%
Surprised 9.1%
Fear 6.6%
Sad 4.5%
Angry 2%

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Sad 97.4%
Disgusted 19.6%
Happy 9.9%
Angry 9.2%
Surprised 7.2%
Fear 6.5%
Confused 3.2%
Calm 2.4%

AWS Rekognition

Age 23-33
Gender Male, 99.5%
Sad 99.8%
Happy 12%
Surprised 10.2%
Fear 8.4%
Disgusted 2.1%
Angry 1.8%
Calm 1.6%
Confused 1.5%

AWS Rekognition

Age 25-35
Gender Female, 97.4%
Calm 24.6%
Disgusted 23.4%
Sad 17.8%
Surprised 13.8%
Confused 10.1%
Fear 9.9%
Happy 3.9%
Angry 3.3%

AWS Rekognition

Age 11-19
Gender Male, 99.9%
Angry 51.3%
Sad 25.2%
Fear 10.2%
Calm 9.6%
Surprised 8.1%
Disgusted 4.7%
Confused 1.5%
Happy 1.5%

AWS Rekognition

Age 18-24
Gender Female, 95%
Surprised 47%
Fear 18.6%
Sad 17.7%
Disgusted 10.7%
Calm 8.6%
Angry 7.9%
Happy 6%
Confused 1.5%

AWS Rekognition

Age 23-31
Gender Male, 81.8%
Calm 43%
Fear 38.7%
Disgusted 8.9%
Surprised 7.6%
Angry 5.9%
Sad 2.6%
Confused 2.1%
Happy 1.8%

AWS Rekognition

Age 14-22
Gender Male, 81.2%
Calm 97.8%
Surprised 6.3%
Fear 5.9%
Sad 2.7%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 23-33
Gender Male, 95.5%
Calm 66.3%
Sad 57.3%
Fear 7%
Surprised 6.4%
Happy 0.8%
Disgusted 0.5%
Confused 0.4%
Angry 0.2%

AWS Rekognition

Age 21-29
Gender Male, 81.4%
Calm 95.3%
Surprised 6.4%
Fear 6%
Sad 2.7%
Happy 1.5%
Angry 0.4%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 27-37
Gender Male, 96%
Happy 65.6%
Surprised 13.3%
Fear 6.4%
Calm 6.4%
Sad 5.8%
Confused 4.7%
Disgusted 4.2%
Angry 0.8%

AWS Rekognition

Age 35-43
Gender Male, 93.2%
Sad 98.4%
Calm 27.4%
Happy 9.3%
Surprised 6.8%
Fear 6.3%
Angry 2.2%
Confused 2.2%
Disgusted 1.4%

AWS Rekognition

Age 11-19
Gender Female, 56.7%
Calm 87%
Fear 7.1%
Surprised 6.5%
Happy 5.1%
Sad 2.6%
Angry 2.2%
Disgusted 0.4%
Confused 0.4%

AWS Rekognition

Age 19-27
Gender Female, 52.6%
Calm 46.9%
Sad 40.2%
Disgusted 19.7%
Surprised 6.5%
Fear 6.4%
Happy 5.4%
Confused 1.2%
Angry 0.8%

AWS Rekognition

Age 16-24
Gender Male, 88.1%
Calm 87.8%
Sad 8.6%
Surprised 6.3%
Fear 5.9%
Angry 0.7%
Happy 0.4%
Confused 0.4%
Disgusted 0.2%

AWS Rekognition

Age 19-27
Gender Male, 91.5%
Calm 53.2%
Sad 22.5%
Confused 21.5%
Fear 6.6%
Surprised 6.6%
Happy 2%
Disgusted 1.7%
Angry 0.9%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Happy 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0%

AWS Rekognition

Age 14-22
Gender Male, 98.1%
Calm 24.3%
Angry 20.1%
Happy 18.4%
Sad 18.1%
Disgusted 10.4%
Surprised 9.7%
Fear 6.5%
Confused 3.5%

AWS Rekognition

Age 19-27
Gender Male, 98.2%
Calm 81.6%
Surprised 8.5%
Fear 6.6%
Sad 3.6%
Happy 3.4%
Disgusted 2.4%
Confused 1.6%
Angry 1.4%

AWS Rekognition

Age 20-28
Gender Female, 82.5%
Calm 77.1%
Surprised 7%
Disgusted 6.9%
Fear 6.8%
Angry 4.8%
Sad 4.4%
Confused 1.2%
Happy 1.1%

AWS Rekognition

Age 16-24
Gender Male, 91.5%
Calm 78.9%
Sad 16.9%
Surprised 6.6%
Fear 6%
Happy 2.3%
Confused 1%
Disgusted 0.7%
Angry 0.4%

AWS Rekognition

Age 13-21
Gender Male, 96.8%
Calm 79.2%
Fear 8.3%
Surprised 6.7%
Disgusted 4.2%
Sad 3.6%
Happy 2.3%
Confused 2.1%
Angry 1.9%

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 55
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Imagga

Traits no traits identified

Feature analysis

Amazon

Person
Sunglasses
Person 99.5%
Person 99%
Person 98.9%
Person 98.8%
Person 98.5%
Person 98.2%
Person 98.1%
Person 97.7%
Person 97.6%
Person 96.9%
Person 96.7%
Person 87.1%
Person 85%
Person 67.9%
Person 66.4%
Person 63.1%
Person 45.5%
Sunglasses 72.9%

Captions

Clarifai
Created by general-english-image-caption-blip on 2025-05-06

a photograph of a group of people standing in front of a crowd (100%)

OpenAI GPT

Created by gpt-4o-2024-11-20 on 2025-06-12

The image depicts a large outdoor gathering of people, primarily men, assembled in close proximity. Many of them appear to be wearing casual shirts or light clothing suited for warm weather. The setting includes a background with trees and a fenced area, suggesting a public space such as a park or institutional grounds. The crowd is clustered together in a way that suggests a meeting, gathering, or event. The overall tone of the image is black-and-white.

Created by gpt-4o-2024-08-06 on 2025-06-12

The image displays a large group of people gathered outdoors. The individuals are wearing a variety of casual clothing, predominantly light shirts, and many appear to be engaged in conversation or observing something of interest. The scene suggests a social or communal event, possibly in a park or an outdoor area, as indicated by the presence of trees and open space in the background. The atmosphere seems lively due to the dense gathering of people.

Meta Llama

Created by us.meta.llama3-2-11b-instruct-v1:0 on 2025-06-04

The image depicts a black-and-white photograph of a large group of men standing in a crowd, with some individuals wearing hats and others dressed in casual attire such as t-shirts and button-down shirts.

In the foreground, several men are visible, with their faces blurred or not clearly defined. The background of the image shows a field or open area with trees and a fence in the distance. The overall atmosphere of the image suggests a gathering or event of some kind, possibly a protest or rally, given the large number of people present.

The image appears to be an archival photograph, as evidenced by the handwritten notation "TL36231.6" at the bottom of the image, which may indicate the photograph's catalog number or identifier.

Created by us.meta.llama3-2-90b-instruct-v1:0 on 2025-06-04

This black-and-white photograph captures a large group of men gathered in an outdoor setting, with a prominent figure in the foreground and a blurred face. The men are dressed in casual attire, including white shirts and hats, and appear to be engaged in a discussion or meeting. In the background, trees and a fence are visible, suggesting a rural or open-air location. The overall atmosphere of the image is one of informality and camaraderie, with the men seemingly relaxed and engaged with one another.

Amazon Nova

Created by amazon.nova-lite-v1:0 on 2025-06-03

The image is a black-and-white photograph of a crowd of people, possibly gathered for a specific event or purpose. The crowd appears to be predominantly male, and they are dressed in simple, plain clothing, suggesting a working-class or rural background. The individuals in the crowd are standing in a loose formation, with some appearing to be engaged in conversation or interaction with one another.

The photograph has a somewhat grainy and aged appearance, indicating that it may be an old or historical image. The background of the image is somewhat blurry, with a few trees and a distant horizon visible. There is a watermark in the bottom left corner of the image, which reads "TL 36231.6," suggesting that it may be a stock photo or an image from a historical archive.

Overall, the image conveys a sense of community and shared purpose, with the individuals in the crowd appearing to be united by a common goal or interest. The plain clothing and simple setting suggest a time period prior to the widespread availability of modern technology and consumer goods, possibly in the early to mid-20th century.

Created by amazon.nova-pro-v1:0 on 2025-06-03

The image is a black-and-white photograph of a large crowd of people, possibly taken in the early 20th century. The people are gathered in an outdoor area, possibly a park or a public square. They are dressed in casual clothing, with some wearing hats and glasses. The crowd appears to be diverse, with people of different ages, genders, and ethnicities. The image has a vintage look, with a slightly grainy texture and faded colors.

Google Gemini

Created by gemini-2.0-flash on 2025-05-07

Here's a description of the image:

The black and white photo shows a large group of men standing outdoors, densely packed together. The men are of varying ethnicities and appear to be wearing work clothes or casual attire, mostly light-colored shirts. Some are wearing hats or caps. The foreground is filled with the backs of several men, with others visible behind them in layers stretching towards the back of the frame. In the distance, there are trees and what appears to be an open area. The overall mood seems to be one of anticipation or waiting. The photograph is marked at the bottom with the text "TL 36231.6".

Created by gemini-2.0-flash-lite on 2025-05-07

Here's a description of the image:

Overall Impression: The image is a black and white photograph of a large crowd of men, likely outdoors. The photograph has a documentary feel to it.

Composition: The shot is taken from a slightly low angle, looking up at the crowd. The men are tightly packed, filling the entire frame. The main focus is on the men in the foreground, with a large number of figures receding into the background.

People: The men are varied in appearance. The majority seem to be wearing light-colored shirts, some with hats, and a few are shirtless. The skin tones suggest a mix of ethnicities. Their expressions are difficult to read, but some appear to be looking toward something off-camera.

Background: The background shows more of the crowd, along with a glimpse of some trees and what might be a building in the distance. The overall atmosphere is slightly hazy.

Overall: It's a photograph that could depict a meeting, a protest, or a gathering of some kind.

Mistral Pixtral

Created by us.mistral.pixtral-large-2502-v1:0 on 2025-06-03

The image depicts a large group of people gathered outdoors, likely for a significant event or protest. The crowd is diverse, including individuals of different ages and ethnicities. Many of the people are dressed in light-colored clothing, possibly due to warm weather. The expressions on their faces vary, with some appearing focused or concerned. The background shows a few trees and an open sky, suggesting the event is taking place in an open area. The overall mood of the image conveys a sense of unity and purpose among the crowd. The photograph is in black and white, which may indicate it was taken some time ago. The code "TL 36231-6" is visible in the bottom left corner, possibly indicating an archive or catalog number.

Text analysis

Amazon

TL36231.6

Google

TL 36 2376
TL
36
2376