Human Generated Data

Title

Untitled (two photographs: group outside partially constructed house; man to right of house and car)

Date

c. 1945, printed later

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6729

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Interior Design 100
Indoors 100
Collage 99.9
Poster 99.9
Advertisement 99.9
Human 98.6
Person 98.6
Person 97.6
Person 97.6
Person 96.9
Room 96.8
Person 96.7
Person 95.6
Person 95.4
Person 92
Theater 83.5
Person 80.3
Transportation 80
Vehicle 80
Car 80
Automobile 80
Head 58.9
Person 50.2

Clarifai
created on 2019-11-16

no person 97.7
people 97.5
picture frame 96.9
street 95.7
city 95.6
architecture 95.5
art 94.6
collage 94.3
window 92.8
travel 91.9
snow 91.8
many 91.7
outdoors 91.4
group 90.7
winter 90.5
movie 90.3
landscape 89.2
margin 88.6
empty 88.4
museum 88.3

Imagga
created on 2019-11-16

billboard 76.6
signboard 62.2
structure 51.1
negative 50.4
film 42.2
photographic paper 25.9
building 24.9
architecture 23.6
old 20.9
city 20.8
grunge 17.9
photographic equipment 17.3
black 16.2
window 15.4
vintage 14.9
urban 14.9
design 14.6
sky 14
travel 13.4
paint 12.7
dirty 12.7
collage 12.5
art 12.4
texture 11.8
frame 11.7
space 11.6
pattern 11.6
night 11.5
text 11.4
antique 11.3
landscape 11.2
winter 10.2
street 10.1
light 10
tower 9.9
grungy 9.5
decoration 9.5
border 9
retro 9
trees 8.9
graphic 8.8
snow 8.7
water 8.7
damaged 8.6
wall 8.6
modern 8.4
facade 8.4
color 8.3
rough 8.2
landmark 8.1
religion 8.1
scratches 7.9
frames 7.8
your 7.7
rust 7.7
tree 7.7
stone 7.7
weathered 7.6
skyline 7.6
church 7.4
digital 7.3
road 7.2
material 7.1
glass 7

Microsoft
created on 2019-11-16

text 95.8
billboard 64.2
tree 62.9
black and white 58.1

Face analysis

Amazon

AWS Rekognition

Age 47-65
Gender Male, 50.4%
Angry 50.3%
Surprised 49.5%
Sad 49.6%
Happy 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 49.5%
Fear 49.5%

AWS Rekognition

Age 7-17
Gender Female, 50.2%
Sad 49.6%
Disgusted 49.5%
Surprised 49.5%
Happy 49.5%
Angry 50.3%
Fear 49.5%
Confused 49.5%
Calm 49.6%

AWS Rekognition

Age 6-16
Gender Female, 50%
Angry 49.6%
Sad 50%
Confused 49.5%
Calm 49.7%
Surprised 49.5%
Happy 49.6%
Disgusted 49.5%
Fear 49.5%

AWS Rekognition

Age 15-27
Gender Female, 50.2%
Happy 49.6%
Fear 49.9%
Sad 49.7%
Confused 49.5%
Disgusted 49.5%
Surprised 49.7%
Calm 49.5%
Angry 49.5%

AWS Rekognition

Age 40-58
Gender Male, 50.1%
Surprised 49.7%
Confused 49.5%
Calm 49.5%
Happy 49.5%
Sad 49.5%
Angry 49.7%
Disgusted 49.5%
Fear 50%

AWS Rekognition

Age 36-52
Gender Female, 50.2%
Calm 50.2%
Fear 49.5%
Confused 49.5%
Happy 49.5%
Angry 49.5%
Disgusted 49.6%
Sad 49.6%
Surprised 49.5%

AWS Rekognition

Age 13-23
Gender Female, 50.1%
Angry 49.6%
Confused 49.5%
Disgusted 49.6%
Happy 49.5%
Sad 49.7%
Fear 49.5%
Calm 50.1%
Surprised 49.5%

AWS Rekognition

Age 23-37
Gender Male, 50.1%
Surprised 49.5%
Angry 49.6%
Happy 49.6%
Fear 49.5%
Calm 50.1%
Disgusted 49.6%
Sad 49.6%
Confused 49.5%

AWS Rekognition

Age 23-37
Gender Female, 50.1%
Sad 49.6%
Disgusted 49.5%
Fear 49.6%
Angry 49.8%
Happy 49.5%
Surprised 49.8%
Calm 49.6%
Confused 49.6%

AWS Rekognition

Age 37-55
Gender Male, 50.1%
Disgusted 49.5%
Sad 50.4%
Fear 49.5%
Angry 49.5%
Confused 49.5%
Happy 49.5%
Calm 49.5%
Surprised 49.5%

AWS Rekognition

Age 13-23
Gender Male, 50%
Confused 49.6%
Angry 49.6%
Surprised 49.5%
Calm 49.7%
Disgusted 49.5%
Fear 49.5%
Sad 50%
Happy 49.5%

AWS Rekognition

Age 23-37
Gender Male, 50.4%
Surprised 49.5%
Happy 49.7%
Sad 49.6%
Fear 49.5%
Calm 49.9%
Confused 49.7%
Disgusted 49.6%
Angry 49.5%

AWS Rekognition

Age 4-14
Gender Female, 50%
Disgusted 49.6%
Fear 49.5%
Surprised 49.5%
Angry 49.6%
Calm 49.6%
Happy 49.5%
Confused 49.5%
Sad 50.1%

AWS Rekognition

Age 13-25
Gender Female, 50.1%
Fear 49.6%
Happy 49.7%
Confused 49.5%
Calm 49.7%
Disgusted 49.5%
Sad 49.7%
Angry 49.6%
Surprised 49.7%

AWS Rekognition

Age 30-46
Gender Male, 50.2%
Happy 49.6%
Confused 49.5%
Calm 49.9%
Fear 49.5%
Angry 49.5%
Surprised 49.6%
Disgusted 49.8%
Sad 49.5%

AWS Rekognition

Age 39-57
Gender Female, 50.1%
Surprised 49.6%
Happy 49.5%
Confused 49.5%
Sad 50.1%
Angry 49.5%
Fear 49.6%
Disgusted 49.5%
Calm 49.7%

AWS Rekognition

Age 21-33
Gender Male, 50.3%
Angry 49.6%
Calm 49.5%
Sad 49.6%
Happy 49.5%
Fear 50.1%
Confused 49.5%
Surprised 49.5%
Disgusted 49.5%

AWS Rekognition

Age 23-35
Gender Male, 50.4%
Angry 49.9%
Fear 49.5%
Calm 49.7%
Sad 49.8%
Disgusted 49.5%
Happy 49.5%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 21-33
Gender Male, 50.2%
Fear 49.7%
Calm 49.6%
Disgusted 49.6%
Angry 49.5%
Surprised 49.9%
Happy 49.6%
Sad 49.6%
Confused 49.5%

AWS Rekognition

Age 26-42
Gender Male, 50.1%
Fear 50.4%
Angry 49.5%
Calm 49.5%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Sad 49.5%
Disgusted 49.5%

AWS Rekognition

Age 25-39
Gender Male, 50.2%
Happy 49.5%
Confused 49.6%
Calm 49.6%
Angry 50.1%
Fear 49.5%
Disgusted 49.6%
Sad 49.5%
Surprised 49.6%

AWS Rekognition

Age 23-37
Gender Male, 50.5%
Calm 49.6%
Fear 49.5%
Confused 50%
Surprised 49.5%
Angry 49.7%
Happy 49.5%
Sad 49.6%
Disgusted 49.5%

AWS Rekognition

Age 15-27
Gender Male, 50.1%
Happy 49.5%
Confused 49.5%
Calm 49.5%
Angry 49.6%
Disgusted 49.6%
Surprised 50%
Fear 49.7%
Sad 49.5%

AWS Rekognition

Age 35-51
Gender Male, 50.4%
Disgusted 49.5%
Happy 49.5%
Angry 49.5%
Fear 49.5%
Sad 49.5%
Surprised 49.9%
Confused 49.6%
Calm 50%

Feature analysis

Amazon

Person
Car
Person 98.6%
Person 97.6%
Person 97.6%
Person 96.9%
Person 96.7%
Person 95.6%
Person 95.4%
Person 92%
Person 80.3%
Person 50.2%
Car 80%

Categories

Imagga

paintings art 93.2%
text visuals 6.5%

Captions

Google Gemini

Created by gemini-2.0-flash on 2025-05-11

The image features three black and white photographs arranged on a black background. The photograph in the upper left corner shows a large group of people seated in an auditorium or theater, possibly attending a meeting or lecture. The audience fills the seats, with a few individuals visible in the balcony above.

To the right of the auditorium photo, the upper right image depicts a group of people standing in front of a building under construction or renovation. Several individuals are on the roof, while others are on the ground, suggesting they are working together on the project. Building materials are scattered around, indicating active construction.

The final photograph in the lower right corner presents a residential street scene with a two-story house visible behind bare trees. A car is parked nearby, and a person stands on the sidewalk, adding a sense of everyday life to the setting.

Created by gemini-2.0-flash-lite on 2025-05-11

This black and white image showcases a collection of photographs mounted on a black background. The background appears to have a glossy sheen.

The photographs include:

  1. Interior of a theater: This image is a medium-wide shot of a theater filled with people seated in rows. The audience faces forward, possibly watching a performance or a film.

  2. Construction site: This image depicts a group of men standing in front of a house under construction. The house appears to be a wooden structure with visible scaffolding and some men are working on the roof.

  3. House with trees: This image displays an older house on a street, with bare trees in the foreground. A car is parked on the street.