Human-Generated Data

Title

Untitled (New Orleans, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1447

Copyright

© President and Fellows of Harvard College

Machine-Generated Data

Tags (confidence, %)

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
Neighborhood 99.9
City 99.3
Road 99.3
Street 99.3
Urban 99.3
People 99.1
Person 99
Child 99
Female 99
Girl 99
Person 98.9
Female 98.9
Adult 98.9
Woman 98.9
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 97.9
Adult 97.9
Male 97.9
Man 97.9
Architecture 97.1
Building 97.1
Outdoors 97.1
Shelter 97.1
Machine 87.6
Wheel 87.6
Car 87
Transportation 87
Vehicle 87
Car 86.6
Clothing 80.7
Hat 80.7
Bus 79.4
Person 64.9
SUV 58
Caravan 57.8
Van 57.8
Lady 57.7
Accessories 57.5
Sunglasses 57.5
Nature 57
Sedan 56.5
Coat 56.5
Alloy Wheel 56.2
Car Wheel 56.2
Spoke 56.2
Tire 56.2
License Plate 55.7
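
These labels follow the output shape of Amazon Rekognition's DetectLabels API. As a minimal sketch, tags like the ones above could be produced with the boto3 SDK roughly as follows; AWS credentials are assumed to be configured, and the file name is hypothetical:

import boto3

client = boto3.client("rekognition")

with open("new_orleans.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest confidence in the listing above is 55.7
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')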

Clarifai
created on 2018-05-11

people 100
group 99.8
group together 98.1
adult 96.8
child 96.6
several 95.4
five 95
offspring 95
man 94.9
four 94.6
sibling 93.7
administration 91.8
woman 91.8
many 91.3
portrait 91.1
facial expression 90.9
three 90.5
vehicle 88.5
wear 87.4
leader 85.1
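
Clarifai concepts like these can be retrieved through its v2 REST API. A minimal sketch using the requests library; the API key and image URL are placeholders, and the general-model id should be verified against current Clarifai documentation:

import requests

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Concept values are 0-1; scale to match the percentages listed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')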

Imagga
created on 2023-10-06

couple 40.9
happy 39.5
together 35.9
man 35.6
love 34.7
senior 34.7
male 31.4
people 31.2
grandfather 28.6
adult 26.5
outdoors 26.3
smiling 26
portrait 24.6
smile 24.2
mature 24.2
happiness 23.5
spectator 23.3
elderly 23
person 20.3
husband 20.2
park 20.1
wife 19.9
affection 19.3
old 18.8
family 18.7
sibling 18.5
fun 17.2
parent 17.2
child 16.7
grandma 16.6
lifestyle 16.6
married 16.3
mother 16.1
friends 16
women 15.8
grandmother 15.7
retired 15.5
outdoor 15.3
loving 14.3
face 14.2
father 13.9
men 13.7
outside 13.7
two 13.5
summer 13.5
retirement 13.4
marriage 13.3
togetherness 13.2
relationship 13.1
friendship 13.1
looking 12.8
affectionate 12.6
romance 12.5
enjoying 12.3
lady 12.2
pretty 11.9
attractive 11.9
aged 11.8
leisure 11.6
daughter 11.6
romantic 11.6
adults 11.3
hair 11.1
dad 11
60s 10.7
handsome 10.7
having 10.6
eyes 10.3
sitting 10.3
hug 9.7
look 9.6
boy 9.6
laughing 9.4
day 9.4
active 9.1
blond 9.1
gray 9
kin 9
vehicle 9
world 8.9
age 8.6
car 8.6
casual 8.5
camera 8.3
holding 8.2
20s 8.2
children 8.2
cheerful 8.1
group 8.1
childhood 8.1
home 8
citizen 7.9
holiday 7.9
aging 7.7
youth 7.7
close 7.4
glasses 7.4
care 7.4
life 7.4
playing 7.3
girls 7.3
relaxing 7.3
cute 7.2
grass 7.1
kid 7.1
pensioner 7.1
spring 7.1
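
Imagga exposes tagging through its /v2/tags endpoint with HTTP Basic authentication. A minimal sketch; the key, secret, and image URL are placeholders:

import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),  # Imagga uses HTTP Basic auth
)

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')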

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.4
person 97.5
posing 73.4
old 61.5
people 58.5
crowd 0.7
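
Tags like these can be requested from Azure's Computer Vision service. A minimal sketch against the v3.2 REST endpoint (the 2018 run above would have used an earlier API version); the resource host and key are placeholders:

import requests

response = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)

# Confidences are 0-1; scale to match the listing above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')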

Face analysis

AWS Rekognition

Age 11-19
Gender Female, 100%
Disgusted 43.8%
Confused 18.3%
Fear 10.8%
Calm 9.9%
Surprised 9.6%
Angry 6.3%
Sad 4.5%
Happy 1.1%

AWS Rekognition

Age 6-14
Gender Female, 99.9%
Sad 92.2%
Calm 42%
Confused 7.6%
Surprised 6.5%
Fear 6%
Angry 5%
Disgusted 0.6%
Happy 0.2%

AWS Rekognition

Age 10-18
Gender Female, 99.9%
Calm 93.1%
Surprised 6.3%
Fear 5.9%
Sad 3.2%
Confused 2.8%
Angry 0.7%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 54-64
Gender Male, 99.3%
Calm 62.5%
Confused 22.9%
Sad 7.4%
Surprised 6.5%
Fear 6%
Angry 2.5%
Disgusted 2.1%
Happy 0.2%

AWS Rekognition

Age 40-48
Gender Female, 59.9%
Sad 100%
Surprised 6.3%
Fear 6.2%
Calm 3.1%
Disgusted 0.5%
Happy 0.4%
Angry 0.4%
Confused 0.2%
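
The age ranges, gender estimates, and emotion scores above match the shape of Rekognition's DetectFaces response. A minimal sketch with boto3; the file name is hypothetical:

import boto3

client = boto3.client("rekognition")

with open("new_orleans.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are returned unordered; sort to list the strongest first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')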

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 11
Gender Female

Microsoft Cognitive Services

Age 13
Gender Female
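
These estimates match the old Azure Face API detect call with age and gender attributes; Microsoft has since restricted access to those attributes, so this is a historical sketch. The resource host, key, and file name are placeholders:

import requests

with open("new_orleans.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY",
             "Content-Type": "application/octet-stream"},
    data=image_bytes,
)

for face in response.json():
    attributes = face["faceAttributes"]
    print(f'Age {attributes["age"]:.0f}')
    print(f'Gender {attributes["gender"].capitalize()}')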

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
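
Google Vision reports per-face likelihood buckets rather than numeric scores. A minimal sketch with the google-cloud-vision client library; credentials are assumed to be configured, and the file name is hypothetical:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("new_orleans.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY.
    for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        value = getattr(face, f"{attr}_likelihood")
        print(attr.capitalize(), value.name.replace("_", " ").capitalize())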

Feature analysis

Amazon

Person 99%
Child 99%
Female 99%
Girl 99%
Adult 98.9%
Woman 98.9%
Male 98.8%
Man 98.8%
Wheel 87.6%
Car 87%

Categories

Imagga

pets animals 52.8%
people portraits 25.3%
paintings art 20.3%
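
These category names come from Imagga's personal_photos categorizer, a separate endpoint from tagging. A minimal sketch; the key, secret, and image URL are placeholders:

import requests

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),
)

for category in response.json()["result"]["categories"]:
    print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')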

Captions

Azure OpenAI

created on 2024-01-26

This is a grayscale photograph featuring a group of individuals on a street. In the background, there is a two-story building that exhibits a degree of wear and age, marked by its weathered facade and wooden shutters. The photograph appears to have been taken in a bygone era, suggested by the vintage automobile parked on the side of the road. Tall palm trees and other mature vegetation can be seen lining the street. The context of the image suggests it might have been captured in an urban environment during the early to mid-20th century.
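
A caption like this can be produced by sending the image to a vision-capable Azure OpenAI deployment. A minimal sketch with the openai SDK; the endpoint, key, deployment name, and file name are all placeholders:

import base64
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR_RESOURCE.openai.azure.com",
    api_key="YOUR_KEY",
    api_version="2024-02-01",
)

with open("new_orleans.jpg", "rb") as f:  # hypothetical file name
    b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="YOUR_DEPLOYMENT",  # a vision-capable deployment, e.g. GPT-4 with vision
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this photograph."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)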

Anthropic Claude

created on 2024-03-30

The image appears to be a black and white photograph taken outdoors, possibly in a small town or city setting. The photograph shows a group of people, including several women and children, standing on a sidewalk or street. One woman in the center of the image has a serious expression on her face, while the other individuals in the group have a range of expressions. In the background, there is a wooden building with a porch, and a vintage-style automobile can be seen parked on the street.
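
Similarly, a Claude caption can be requested through the Anthropic Python SDK. A minimal sketch; the model id shown is one of the vision-capable models available around the creation date above (the actual model used is not recorded), and the file name is hypothetical:

import base64
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("new_orleans.jpg", "rb") as f:  # hypothetical file name
    b64 = base64.b64encode(f.read()).decode()

message = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=300,
    messages=[{
        "role": "user",
        "content": [
            {"type": "image",
             "source": {"type": "base64", "media_type": "image/jpeg", "data": b64}},
            {"type": "text", "text": "Describe this photograph."},
        ],
    }],
)
print(message.content[0].text)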